Esophageal cancer screening isn’t for everyone: Study
Endoscopic screening for esophageal adenocarcinoma (EAC) may not be a cost-effective strategy for all populations and may even lead to net harm in some, according to a comparative cost-effectiveness analysis.
Several U.S. guidelines suggest the use of endoscopic screening for EAC, yet recommendations within these guidelines vary in terms of which populations should receive screening, according to study authors led by Joel H. Rubenstein, MD, of the Lieutenant Charles S. Kettles Veterans Affairs Medical Center, Ann Arbor, Mich. Their findings were published in Gastroenterology. In addition, no randomized trials to date have evaluated endoscopic screening outcomes among different populations; population screening recommendations in the current guidelines have been informed mostly by observational data and expert opinion.
Existing cost-effectiveness analyses of EAC screening have mostly focused on screening older men with gastroesophageal reflux disease (GERD) at certain ages, and many of these analyses have limited data regarding diverse patient populations.
In their study, Dr. Rubenstein and colleagues performed a comparative cost-effectiveness analysis of endoscopic screening for EAC, comparing screening in the general population with screening restricted to individuals with GERD symptoms. The analysis was stratified by race and sex. The primary objective was to identify the optimal age at which to offer endoscopic screening in each population evaluated.
The investigators conducted their comparative cost-effectiveness analyses using three independent simulation models. The independently developed models – which address EAC natural history, screening, surveillance, and treatment – are part of the National Cancer Institute’s Cancer Intervention and Surveillance Modeling Network. Within each model, four cohorts were defined by race (White or Black) and sex, and each was independently calibrated to reproduce EAC incidence in the United States. The three models were built on somewhat different structures and assumptions; for example, two assumed a stable prevalence of GERD symptoms of approximately 20% across ages, while the third assumed a near-linear increase across adulthood. All three assumed that EAC develops only in individuals with Barrett’s esophagus.
In each base case, the researchers simulated cohorts of U.S. individuals born in 1950, stratified by race and sex, and followed each individual from 40 to 100 years of age. The researchers considered 42 strategies, including no screening, a single endoscopic screening of everyone at one of six specified ages (between 40 and 65 years), and a single screening of individuals with GERD symptoms at the same six ages.
Primary results were averaged across the three models. The investigators defined the optimal screening strategy as the most effective strategy with an incremental cost-effectiveness ratio (ICER) below $100,000 per quality-adjusted life-year (QALY) gained.
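To illustrate that decision rule, here is a minimal Python sketch of the standard cost-effectiveness frontier selection under a $100,000-per-QALY willingness-to-pay threshold. It is not the study's model; the strategy names, costs, and QALY values below are hypothetical, used only to show how the most effective strategy under the threshold is picked.

```python
# Minimal sketch of cost-effectiveness frontier selection (illustrative only;
# strategy names, costs, and QALYs are hypothetical, not study outputs).

def optimal_strategy(strategies, wtp=100_000):
    """Return the most effective strategy whose ICER versus the previous
    frontier strategy stays below the willingness-to-pay (wtp) threshold."""
    # Sort by cost; tie-break on QALYs so dominated ties drop out.
    ranked = sorted(strategies, key=lambda s: (s["cost"], s["qaly"]))

    # Build the efficiency frontier, removing dominated strategies.
    frontier = []
    for s in ranked:
        # Strong dominance: costs more but yields no additional QALYs.
        if frontier and s["qaly"] <= frontier[-1]["qaly"]:
            continue
        frontier.append(s)
        # Extended dominance: drop a middle strategy whose ICER exceeds
        # that of the next, more effective option.
        while len(frontier) >= 3:
            a, b, c = frontier[-3], frontier[-2], frontier[-1]
            icer_ab = (b["cost"] - a["cost"]) / (b["qaly"] - a["qaly"])
            icer_bc = (c["cost"] - b["cost"]) / (c["qaly"] - b["qaly"])
            if icer_ab >= icer_bc:
                frontier.pop(-2)  # b is extendedly dominated
            else:
                break

    # Walk up the frontier, accepting each step whose ICER is under threshold.
    best = frontier[0]
    for prev, nxt in zip(frontier, frontier[1:]):
        icer = (nxt["cost"] - prev["cost"]) / (nxt["qaly"] - prev["qaly"])
        if icer < wtp:
            best = nxt
        else:
            break
    return best

# Hypothetical per-person example: three screening strategies vs. no screening.
strategies = [
    {"name": "no screening",        "cost": 0,    "qaly": 20.000},
    {"name": "GERD once at 55",     "cost": 500,  "qaly": 20.010},
    {"name": "GERD at 45 and 60",   "cost": 1200, "qaly": 20.018},
    {"name": "screen all at 40-55", "cost": 4000, "qaly": 20.020},
]
print(optimal_strategy(strategies)["name"])  # -> "GERD at 45 and 60"
```

The rule mirrors the article's definition: the most effective strategy wins, but only while each incremental step along the efficiency frontier costs less than $100,000 per QALY gained.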
The most effective – yet most costly – screening strategies for White men screened all White men once between 40 and 55 years of age. The optimal strategy, however, screened White men with GERD symptoms twice, once at age 45 years and again at age 60 years. For Black men, the researchers determined that a single screening of those with GERD symptoms at 55 years of age was optimal.
By contrast, the optimal strategy for women, whether White or Black, was no screening at all. “In particular, among Black women, screening is, at best, very expensive with little benefit, and some strategies cause net harm,” the authors wrote.
The investigators wrote that there is a need for empiric, long-term studies “to confirm whether repeated screening has a substantial yield of incident” Barrett’s esophagus. The researchers also noted that their study was limited by the exclusion of additional risk factors, such as smoking, obesity, and family history, whose inclusion might have led to different conclusions about specific screening strategies.
“We certainly acknowledge the history of health care inequities, and that race is a social construct that, in the vast majority of medical contexts, has no biological basis. We are circumspect regarding making recommendations based on race or sex if environmental exposures or genetic factors on which to make those recommendations were available,” they wrote.
The study was supported by National Institutes of Health/National Cancer Institute grants. Some authors disclosed relationships with Lucid Diagnostics, Value Analytics Labs, and Cernostics.
Over the past decades we have seen an alarming rise in the incidence of esophageal adenocarcinoma, mostly diagnosed at an advanced stage when curative treatment is no longer an option. Esophageal adenocarcinoma develops from Barrett’s esophagus, which, if known to be present, can be surveilled to detect dysplasia and cancer at an early and curable stage.
Whereas screening for Barrett’s esophagus currently focuses on White males with gastroesophageal reflux, little is known about screening in non-White and non-male populations. Identifying whom and how to screen poses a challenge, and in real life studies of such varied populations would require many patients, years of follow-up, and substantial effort and cost. Rubenstein and colleagues used three independent simulation models to simulate many different screening scenarios while taking gender and race into account. The outcomes of this study, which demonstrate that one size does not fit all, will be very relevant in guiding future strategies for screening for Barrett’s esophagus and early esophageal adenocarcinoma. Although the study is based on endoscopic screening, the insights gained will also be relevant when considering nonendoscopic screening tools.
R.E. Pouw, MD, PhD, is with Amsterdam University Medical Centers. She disclosed having been a consultant for MicroTech and Medtronic and having received speaker fees from Pentax.
FROM GASTROENTEROLOGY
Confronting endoscopic infection control
The reprocessing of endoscopes following gastrointestinal endoscopy is highly effective at mitigating the risk of exogenous infections, yet challenges in duodenoscope reprocessing persist. While several enhanced reprocessing measures have been developed to reduce duodenoscope-related infection risks, the effectiveness of these measures is largely unclear.
Rahul A. Shimpi, MD, and Joshua P. Spaete, MD, from Duke University, Durham, N.C., wrote in a paper in Techniques and Innovations in Gastrointestinal Endoscopy that novel disposable duodenoscope technologies offer promise for reducing infection risk and overcoming current reprocessing challenges. The paper notes that, despite this promise, there is a need to better define the usability, costs, and environmental impact of these disposable technologies.
Current challenges in endoscope reprocessing
According to the authors, the reprocessing of gastrointestinal endoscopes involves several sequential steps that require “meticulous” attention to detail “to ensure the adequacy of reprocessing.” Human factors and errors are a major contributor to suboptimal reprocessing quality, and these errors are often related to variable adherence to current reprocessing protocols among centers and reprocessing staff members.
Despite these challenges, infectious complications associated with gastrointestinal endoscopy are rare, particularly with end-viewing endoscopes. Many high-profile infectious outbreaks associated with duodenoscopes have been reported in recent years, however, heightening awareness of and concern about endoscope reprocessing. Many of these outbreaks, the authors said, have involved multidrug-resistant organisms.
The complex elevator mechanism, which the authors noted “is relatively inaccessible during the precleaning and manual cleaning steps in reprocessing,” represents a paramount challenge in the reprocessing of duodenoscopes. This relative inaccessibility potentially contributes to greater biofilm formation and contamination. Other factors implicated in patient-to-patient transmission of duodenoscope-associated infections include other design issues, human errors in reprocessing, endoscope damage and channel defects, and storage and environmental factors.
“Given the reprocessing challenges posed by duodenoscopes, in 2015 the Food and Drug Administration issued a recommendation that one or more supplemental measures be implemented by facilities as a means to decrease the infectious risk posed by duodenoscopes,” the authors noted, including ethylene oxide (EtO) sterilization, liquid chemical sterilization, and repeat high-level disinfection (HLD). They added, however, that a recent U.S. multisociety reprocessing guideline “does not recommend repeat high-level disinfection over single high-level disinfection, and recommends use of EtO sterilization only for duodenoscopes in infectious outbreak settings.”
New sterilization technologies
Liquid chemical sterilization (LCS) may be a promising alternative to EtO sterilization because it features a shorter disinfection cycle time and less endoscope wear and damage. However, clinical data on the effectiveness of LCS in endoscope reprocessing remain very limited.
The high costs and toxicities associated with EtO sterilization may be overcome by plasma-activated gas, another novel low-temperature sterilization technology. This newer technique also features a shorter reprocessing time, making it an attractive option for duodenoscope reprocessing. The authors noted that, although it showed promise in a proof-of-concept study, “plasma-activated gas has not been assessed in working endoscopes or compared directly to existing HLD and EtO sterilization technologies.”
Quality indicators in reprocessing
Recently, several quality indicators have been developed to assess the quality of endoscope reprocessing. The indicators, the authors noted, may theoretically allow “for point-of-care assessment of reprocessing quality.” To date, the data to support these indicators are limited.
Adenosine triphosphate (ATP) testing has been the most widely studied indicator because it can be used to detect biofilm during endoscope reprocessing against previously established ATP benchmark levels, the authors wrote. Studies that have assessed the efficacy of ATP testing, however, are limited by their use of heterogeneous assays, analytical techniques, and cutoffs for identifying contamination.
Hemoglobin, protein, and carbohydrate are other point-of-care indicators that have shown potential for assessing the adequacy of manual endoscope cleaning before high-level disinfection or sterilization.
Novel disposable duodenoscope technologies
Given consistent evidence of residual duodenoscope contamination after both standard and enhanced reprocessing, attention has increasingly turned to novel disposable duodenoscope technologies. In 2019, the FDA recommended a move toward duodenoscopes with disposable components because such designs could make reprocessing easier, more effective, or altogether unnecessary. According to the authors, six duodenoscopes with disposable components are currently cleared by the FDA for use: three that use a disposable endcap, one that uses a disposable elevator and endcap, and two that are fully disposable. The authors stated that, while “improved access to the elevator facilitated by a disposable endcap may allow for improved cleaning” and reduce contamination and biofilm formation, there are no data to confirm these proposed advantages.
There are several unanswered questions regarding new disposable duodenoscope technologies, including questions related to the usability, costs, and environmental impact of these technologies. The authors summarized several studies discussing these issues; however, a clear definition or consensus regarding how to approach these challenges has yet to be established. In addition to these unanswered questions, the authors also noted that identifying the acceptable rate of infectious risk associated with disposable duodenoscopes is another “important task” that needs to be accomplished in the near future.
Environmental impact
The authors stated that the U.S. health care system is directly responsible for up to 10% of total U.S. greenhouse gas emissions. Additionally, the heavy use of chemicals and water in endoscope reprocessing represents a “substantial” concern for the environment. One estimate suggested that a unit performing a mean of 40 total endoscopies per day generates around 15.78 tons of CO2 per year.
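For a back-of-the-envelope sense of scale (assuming metric tons and roughly 260 working days per year, assumptions not stated in the article), that estimate works out to approximately

$$40\ \tfrac{\text{procedures}}{\text{day}} \times 260\ \tfrac{\text{days}}{\text{yr}} \approx 10{,}400\ \tfrac{\text{procedures}}{\text{yr}}, \qquad \frac{15{,}780\ \text{kg CO}_2/\text{yr}}{10{,}400\ \text{procedures/yr}} \approx 1.5\ \text{kg CO}_2\ \text{per procedure}.$$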
Given the uncertain environmental impact of disposable endoscopes, the authors suggested a clear need for interventions that reduce their potential negative impact. Proposed strategies include reducing the number of endoscopies performed, increasing recycling and the use of recyclable materials, and using renewable energy sources in endoscopy units.
“The massive environmental impact of gastrointestinal endoscopy as a whole has become increasingly recognized,” the authors wrote, “and further study and interventions directed at improving the environmental footprint of endoscopy will be of foremost importance.”
The authors disclosed no conflicts of interest.
The future remains to be seen
Solutions for proper endoscope reprocessing and infection prevention have become a major focus of investigation and of innovation in endoscope design, particularly for duodenoscopes. As multiple infectious outbreaks associated with duodenoscopes have been reported, the complex mechanism of the duodenoscope elevator has emerged as the target for modification because it is somewhat inaccessible and difficult to clean adequately.
One of the major considerations related to disposable duodenoscopes is cost. Currently, the savings from eliminating reprocessing equipment, supplies, and personnel do not offset the cost of the disposable duodenoscope. Studies on the environmental impact of disposable duodenoscopes suggest a major increase in endoscopy-related waste.
In summary, enhanced reprocessing techniques and modified scope design elements may not achieve adequate thresholds for infection prevention. Furthermore, while fully disposable duodenoscopes offer promise, questions remain about overall functionality, cost, and the potentially profound environmental impact. Further research is warranted on feasible solutions for infection prevention, and the issues of cost and environmental impact must be addressed before the widespread adoption of disposable duodenoscopes.
Jennifer Maranki, MD, MSc, is professor of medicine and director of endoscopy at Penn State Hershey (Pennsylvania) Medical Center. She reports being a consultant for Boston Scientific.
FROM TECHNIQUES AND INNOVATIONS IN GASTROINTESTINAL ENDOSCOPY
Do myenteric neurons replicate in small intestine?
A new study contradicts controversial findings from a 2017 study that had suggested around two-thirds of myenteric neurons replicate within 1 week under normal conditions, which – if true – would have an impact on research into several GI diseases and pathologies.
Previous research had suggested that enteric nerve cells, which help control peristalsis throughout the digestive tract, do not replicate in the small intestine under normal conditions, with limited replicative potential observed only after injury, wrote Heikki Virtanen, MD, of the University of Helsinki (Finland), and colleagues. Their report is in Cellular and Molecular Gastroenterology and Hepatology. However, a study by Subhash Kulkarni, PhD, published in 2017, “challenged this dogma, suggesting that almost 70% of myenteric neurons are replaced within 1 week under normal physiological conditions.” Those findings were considered controversial and carried a “possibly far-reaching impact on future research,” Dr. Virtanen and colleagues explained.
According to the researchers, the difference between the controversial study findings and other research results may be partially explained by differences in methodology such as DNA labeling times, antigen retrieval methods, and analyzed portions of the small intestine. Dr. Virtanen and colleagues initiated the current study because no systematic evaluation of those potential confounding variables or attempt at independently replicating the findings had been undertaken.
To match the original protocol, Dr. Virtanen and colleagues administered the nucleoside analogue 5-iodo-2’-deoxyuridine (IdU) in drinking water using the same concentration, labeling period, DNA denaturation steps, and antibodies as Dr. Kulkarni’s 2017 study. However, they also examined additional areas of the small intestine, employed paraffin embedding, and performed parallel analysis using “click chemistry”–based detection of 5-ethynyl-2’-deoxyuridine (EdU), among other additions.
The gut’s epithelial cells turn over within 1 week “and serve as an internal positive control for DNA replication,” the researchers noted. In this study, microscopic analysis of immunohistochemically labeled small intestine – in both cryosections and paraffin-embedded sections – revealed no IdU-positive enteric nerve cells, nor did examination of 300 ganglia in the small intestine. By contrast, the researchers wrote, the epithelium demonstrated label retention.
In the discussion section of their paper, Dr. Virtanen and colleagues wrote that, while “proliferating epithelial cells were readily detectable” in the study, they were unable to detect enteric neuronal proliferation. Although they could not identify the reasons for the observations by Kulkarni and colleagues, Dr. Virtanen and colleagues suspected that unnoticed variables in the 2017 study affected its findings.
“The fact that the repeat of exactly the same experiment with the same reagents and methods did not reproduce the finding, not even partially, supports this interpretation and is further supported by the same conclusion using EdU-based click chemistry data and previous studies.”
The authors disclose no conflicts.
The enteric nervous system (ENS) is composed of neurons and glia along the GI tract that are responsible for coordinating its motility, absorption, secretion, and other essential functions. While new neurons are formed during gut development, enteric neurogenesis in adult animals has been a subject of controversy but is of fundamental importance to understanding ENS biology and pathophysiology.
To settle the debate, Virtanen et al. replicated the Kulkarni study using the same methods, with the addition of EdU-based click chemistry, and found no replicating neurons. The bulk of evidence thus supports the concept that enteric neurons in the adult gut are a stable population that undergoes minimal turnover. Enteric neuronal progenitors, however, are present in the adult gut and can undergo neurogenesis in response to injury. Further research is needed to identify the signals that activate that neurogenic response and to understand how it can be leveraged to treat neurointestinal diseases.
Allan M. Goldstein, MD, is chief of pediatric surgery at Massachusetts General Hospital, professor of surgery at Harvard Medical School, principal investigator in the Pediatric Surgery Research Laboratories, and codirector of the Massachusetts General Center for Neurointestinal Health, all in Boston. He has no relevant conflicts.
The enteric nervous system (ENS) is composed of neurons and glia along the GI tract that are responsible for coordinating its motility, absorption, secretion, and other essential functions. While new neurons are formed during gut development, enteric neurogenesis in adult animals has been a subject of controversy but is of fundamental importance to understanding ENS biology and pathophysiology.
To settle the debate, Virtanen et al. replicated the Kulkarni study using the same methods, with the addition of EdU-based click chemistry, and found no replicating neurons. The bulk of evidence thus supports the concept that enteric neurons in the adult gut are a stable population that undergo minimal turnover. Enteric neuronal progenitors, however, are present in the adult gut and can undergo neurogenesis in response to injury. Further research is needed to identify the signals that activate that neurogenic response and to understand how it can be leveraged to treat neurointestinal diseases.
Allan M. Goldstein, MD, is chief of pediatric surgery at Massachusetts General Hospital, professor of surgery at Harvard Medical School, principal investigator in the Pediatric Surgery Research Laboratories, and codirector of the Massachusetts General Center for Neurointestinal Health, all in Boston. He has no relevant conflicts.
The enteric nervous system (ENS) is composed of neurons and glia along the GI tract that are responsible for coordinating its motility, absorption, secretion, and other essential functions. While new neurons are formed during gut development, enteric neurogenesis in adult animals has been a subject of controversy but is of fundamental importance to understanding ENS biology and pathophysiology.
To settle the debate, Virtanen et al. replicated the Kulkarni study using the same methods, with the addition of EdU-based click chemistry, and found no replicating neurons. The bulk of evidence thus supports the concept that enteric neurons in the adult gut are a stable population that undergo minimal turnover. Enteric neuronal progenitors, however, are present in the adult gut and can undergo neurogenesis in response to injury. Further research is needed to identify the signals that activate that neurogenic response and to understand how it can be leveraged to treat neurointestinal diseases.
Allan M. Goldstein, MD, is chief of pediatric surgery at Massachusetts General Hospital, professor of surgery at Harvard Medical School, principal investigator in the Pediatric Surgery Research Laboratories, and codirector of the Massachusetts General Center for Neurointestinal Health, all in Boston. He has no relevant conflicts.
A new study contradicts controversial findings from a 2017 study that had suggested around two-thirds of myenteric neurons replicate within 1 week under normal conditions, which – if true – would have an impact on research into several GI diseases and pathologies.
Previous research had suggested that enteric nerve cells, which help control peristalsis throughout the digestive tract, do not replicate in the small intestine under normal conditions, with some limited potential for it observed only after injury, wrote Heikki Virtanen, MD, of the University of Helsinki (Finland), and colleagues. Their report is in Cellular and Molecular Gastroenterology and Hepatology. However, a study by Subhash Kulkarni, PhD, published in 2017, “challenged this dogma, suggesting that almost 70% of myenteric neurons are replaced within 1 week under normal physiological conditions.” These findings were reportedly considered controversial and presented “possibly far-reaching impact on future research,” Dr. Virtanen and colleagues explained.
According to the researchers, the difference between the controversial study findings and other research results may be partially explained by differences in methodology such as DNA labeling times, antigen retrieval methods, and analyzed portions of the small intestine. Dr. Virtanen and colleagues initiated the current study because no systematic evaluation of those potential confounding variables or attempt at independently replicating the findings had been undertaken.
For example, Dr. Virtanen and colleagues administered the nucleoside analogue 5-iodo-2’-deoxyuridine (IdU) in drinking water with the same concentration and labeling period, DNA denaturation steps, and antibodies as Dr. Kulkarni’s 2017 study had used. However, they also examined additional areas of the small intestine, employed paraffin embedding, performed parallel analysis using “click chemistry”-based detection of 5-ethynyl-2’-deoxyuridine (EdU), and more.
The gut’s epithelial cells turn over within 1 week “and serve as an internal positive control for DNA replication,” the researchers noted. In this study, IdU-positive enteric nerve cells were not revealed in microscopic analysis of immunohistochemically labeled small intestines of both cryosections and paraffin-embedded sections or in measurement of 300 ganglia in the small intestine. In contrast, the researchers wrote that the epithelium demonstrated label retention.
In their discussion section of their paper, Dr. Virtanen and colleagues wrote that while “proliferating epithelial cells were readily detectable” in the study, they were unable to detect enteric neuronal proliferation. Although noting that they could not identify reasons for the observations by Kulkarni and colleagues, Dr. Virtanen and colleagues continued to suspect unnoticed variables in the 2017 study affected its findings.
“The fact that the repeat of exactly the same experiment with the same reagents and methods did not reproduce the finding, not even partially, supports this interpretation and is further supported by the same conclusion using EdU-based click chemistry data and previous studies.”
The authors disclose no conflicts.
A new study contradicts controversial findings from a 2017 study that had suggested around two-thirds of myenteric neurons replicate within 1 week under normal conditions, which – if true – would have an impact on research into several GI diseases and pathologies.
Previous research had suggested that enteric nerve cells, which help control peristalsis throughout the digestive tract, do not replicate in the small intestine under normal conditions, with some limited potential for it observed only after injury, wrote Heikki Virtanen, MD, of the University of Helsinki (Finland), and colleagues. Their report is in Cellular and Molecular Gastroenterology and Hepatology. However, a study by Subhash Kulkarni, PhD, published in 2017, “challenged this dogma, suggesting that almost 70% of myenteric neurons are replaced within 1 week under normal physiological conditions.” These findings were reportedly considered controversial and presented “possibly far-reaching impact on future research,” Dr. Virtanen and colleagues explained.
According to the researchers, the discrepancy between the 2017 findings and other research may be partially explained by methodological differences, such as DNA labeling times, antigen retrieval methods, and the portions of the small intestine analyzed. Dr. Virtanen and colleagues initiated the current study because no systematic evaluation of those potential confounding variables, and no independent replication attempt, had been undertaken.
Accordingly, Dr. Virtanen and colleagues administered the nucleoside analogue 5-iodo-2’-deoxyuridine (IdU) in drinking water, using the same concentration, labeling period, DNA denaturation steps, and antibodies as Dr. Kulkarni’s 2017 study. They also examined additional areas of the small intestine, employed paraffin embedding, and performed a parallel analysis using “click chemistry”-based detection of 5-ethynyl-2’-deoxyuridine (EdU), among other methodological extensions.
The gut’s epithelial cells turn over within 1 week “and serve as an internal positive control for DNA replication,” the researchers noted. In this study, microscopic analysis of immunohistochemically labeled small intestine, in both cryosections and paraffin-embedded sections, revealed no IdU-positive enteric nerve cells, nor did examination of 300 ganglia in the small intestine. In contrast, the epithelium demonstrated label retention.
In the discussion section of their paper, Dr. Virtanen and colleagues wrote that while “proliferating epithelial cells were readily detectable” in the study, they were unable to detect enteric neuronal proliferation. Although they could not pinpoint the reasons for the observations of Kulkarni and colleagues, they suspected that unnoticed variables in the 2017 study affected its findings.
“The fact that the repeat of exactly the same experiment with the same reagents and methods did not reproduce the finding, not even partially, supports this interpretation and is further supported by the same conclusion using EdU-based click chemistry data and previous studies.”
The authors disclosed no conflicts.
FROM CELLULAR AND MOLECULAR GASTROENTEROLOGY AND HEPATOLOGY
AGA Clinical Practice Guideline: Diagnosis, treatment of rare hamartomatous polyposis
The guideline was developed by a panel comprising experts representing the American College of Gastroenterology, the American Gastroenterological Association, and the American Society for Gastrointestinal Endoscopy.
Gastrointestinal hamartomatous polyposis syndromes are rare, autosomal dominant disorders associated with intestinal and extraintestinal tumors. Expert consensus statements have previously offered some recommendations for managing these syndromes, but clinical data are scarce, so the present review “is intended to establish a starting point for future research,” lead author C. Richard Boland, MD, of the University of California, San Diego, and colleagues reported.
According to the investigators, “there are essentially no long-term prospective controlled studies of comparative effectiveness of management strategies for these syndromes.” As a result, their recommendations are based on “low-quality” evidence according to GRADE criteria.
Still, Dr. Boland and colleagues highlighted that “there has been tremendous progress in recent years, both in understanding the underlying genetics that underpin these disorders and in elucidating the biology of associated premalignant and malignant conditions.”
The guideline was published online in Gastroenterology.
Four syndromes reviewed
The investigators gathered these data to provide an overview of genetic and clinical features for each syndrome, as well as management strategies. Four disorders are included: juvenile polyposis syndrome; Peutz-Jeghers syndrome; hereditary mixed polyposis syndrome; and PTEN-hamartoma tumor syndrome, encompassing Bannayan-Riley-Ruvalcaba syndrome and Cowden’s syndrome.
Although all gastrointestinal hamartomatous polyposis syndromes are caused by germline alterations, Dr. Boland and colleagues pointed out that diagnoses are typically made based on clinical criteria, with germline results serving as confirmatory evidence.
The guideline recommends that any patient with a family history of hamartomatous polyps, or with a history of at least two hamartomatous polyps, should undergo genetic testing. The guideline also provides more nuanced genetic testing algorithms for each syndrome.
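For illustration only, the baseline testing threshold can be captured by a simple predicate, as in the sketch below; the function and parameter names are hypothetical, and the guideline’s syndrome-specific algorithms add considerable nuance beyond this rule.

```python
# Illustrative sketch of the guideline's baseline threshold for genetic
# testing. The function and parameter names are hypothetical; the
# guideline's per-syndrome algorithms are more nuanced than this rule.

def meets_genetic_testing_threshold(family_history: bool, polyp_count: int) -> bool:
    """True if the patient has a family history of hamartomatous polyps,
    or a personal history of at least two hamartomatous polyps."""
    return family_history or polyp_count >= 2

# Example: no family history, but three documented hamartomatous polyps.
print(meets_genetic_testing_threshold(family_history=False, polyp_count=3))  # True
```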
Among the hamartomatous polyp disorders, Peutz-Jeghers syndrome is the best understood, according to the investigators. It is caused by aberrations in the STK11 gene and is characterized by polyps with “branching bands of smooth muscle covered by hyperplastic glandular mucosa” that may occur in the stomach, small intestine, and colon. Patients are also at risk of extraintestinal neoplasia.
For management of Peutz-Jeghers syndrome, the guideline advises frequent endoscopic surveillance to prevent mechanical obstruction and bleeding, as well as multidisciplinary surveillance of the breasts, pancreas, ovaries, testes, and lungs.
Juvenile polyposis syndrome is most often characterized by solitary, sporadic polyps in the colorectum (98% of patients affected), followed distantly by polyps in the stomach (14%), ileum (7%), jejunum (7%), and duodenum (7%). The condition is linked with abnormalities in BMPR1A or SMAD4 genes, with SMAD4 germline abnormalities more often leading to “massive” gastric polyps, gastrointestinal bleeding, protein-losing enteropathy, and a higher incidence of gastric cancer in adulthood. Most patients with SMAD4 mutations also have hereditary hemorrhagic telangiectasia, characterized by gastrointestinal bleeding from mucocutaneous telangiectasias, arteriovenous malformations, and epistaxis.
Management of juvenile polyposis syndrome depends on frequent colonoscopies with polypectomies beginning at 12-15 years.
“The goal of surveillance in juvenile polyposis syndrome is to mitigate symptoms related to the disorder and decrease the risk of complications from the manifestations, including cancer,” Dr. Boland and colleagues wrote.
PTEN-hamartoma tumor syndrome, which includes both Bannayan-Riley-Ruvalcaba syndrome and Cowden’s syndrome, is caused by abnormalities in the eponymous PTEN gene. Patients with the condition have an increased risk of colon cancer and polyposis, as well as extraintestinal cancers.
Diagnosis of PTEN-hamartoma tumor syndrome may be complex, involving “clinical examination, mammography and breast MRI, thyroid ultrasound, transvaginal ultrasound, upper gastrointestinal endoscopy, colonoscopy, and renal ultrasound,” according to the guideline.
After diagnosis, frequent colonoscopies are recommended, typically starting at age 35 years, as well as continued surveillance of other organs.
Hereditary mixed polyposis syndrome, which involves attenuated colonic polyposis, is the rarest of the four disorders, having been reported in only “a few families,” according to the guideline. The condition has been linked with “large duplications of the promoter region or entire GREM1 gene.”
Onset is typically in the late 20s, “which is when colonoscopic surveillance should begin,” the investigators wrote. More data are needed to determine appropriate surveillance intervals and whether the condition is associated with increased risk of extraintestinal neoplasia.
This call for more research into gastrointestinal hamartomatous polyposis syndromes carried through to the conclusion of the guideline.
“Long-term prospective studies of mutation carriers are still needed to further clarify the risk of cancer and the role of surveillance in these syndromes,” Dr. Boland and colleagues wrote. “With increases in genetic testing and evaluation, future studies will be conducted with more robust cohorts of genetically characterized, less heterogeneous populations. However, there is also a need to study patients and families with unusual phenotypes where no genotype can be found.”
The investigators disclosed no conflicts of interest with the current guideline; however, they provided a list of industry relationships, including Salix Pharmaceuticals, Ferring Pharmaceuticals, and Pfizer, among others.
FROM GASTROENTEROLOGY
AGA Clinical Practice Update: Expert review of dietary options for IBS
The American Gastroenterological Association (AGA) has published a clinical practice update on dietary interventions for patients with irritable bowel syndrome (IBS). Topics range from identifying suitable candidates for dietary interventions to the levels of evidence for specific diets, which are increasingly recognized for their key role in managing patients with IBS, according to lead author William D. Chey, MD, of the University of Michigan, Ann Arbor, and colleagues.
“Most medical therapies for IBS improve global symptoms in fewer than one-half of patients, with a therapeutic gain of 7%-15% over placebo,” the researchers wrote in Gastroenterology. “Most patients with IBS associate their GI symptoms with eating food.”
According to Dr. Chey and colleagues, clinicians who are considering dietary modifications for treating IBS should first recognize the inherent challenges presented by this process and be aware that new diets won’t work for everyone.
“Specialty diets require planning and preparation, which may be impractical for some patients,” they wrote, noting that individuals with “decreased cognitive abilities and significant psychiatric disease” may be unable to follow diets or interpret their own responses to specific foods. Special diets may also be inappropriate for patients with financial constraints, and “should be avoided in patients with an eating disorder.”
Because of the challenges involved in dietary interventions, Dr. Chey and colleagues advised clinical support from a registered dietitian nutritionist or other resource.
Patients who are suitable candidates for intervention and willing to try a new diet should first provide information about their current eating habits. A food trial can then be personalized and implemented for a predetermined amount of time. If the patient does not respond, then the diet should be stopped and changed to a new diet or another intervention.
Dr. Chey and colleagues discussed three specific dietary interventions and their corresponding levels of evidence: soluble fiber; the low-fermentable oligo-, di-, and monosaccharides and polyols (FODMAP) diet; and a gluten-free diet.
“Soluble fiber is efficacious in treating global symptoms of IBS,” they wrote, citing 15 randomized controlled trials. Soluble fiber is most suitable for patients with constipation-predominant IBS, and different soluble fibers may yield different outcomes based on characteristics such as rate of fermentation and viscosity. In contrast, insoluble fiber is unlikely to help with IBS, and may worsen abdominal pain and bloating.
The low-FODMAP diet is “currently the most evidence-based diet intervention for IBS,” especially for patients with diarrhea-predominant IBS. Dr. Chey and colleagues offered a clear roadmap for employing the diet. First, patients should eat only low-FODMAP foods for 4-6 weeks. If symptoms don’t improve, the diet should be stopped. If symptoms do improve, foods containing a single FODMAP should be reintroduced one at a time, each in increasing quantities over 3 days, alongside documentation of symptom responses. Finally, the diet should be personalized based on these responses. The majority of patients (close to 80%) “can liberalize” a low-FODMAP diet based on their responses.
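For readers who want the bookkeeping of that roadmap laid out explicitly, here is a minimal sketch of the reintroduction step; the class, field names, and symptom-score threshold are hypothetical illustrations, not part of the update itself.

```python
# Minimal sketch of the low-FODMAP reintroduction bookkeeping described
# above. The class, field names, and symptom-score threshold are
# hypothetical; clinical decisions rest with the patient and dietitian.

from dataclasses import dataclass, field

ELIMINATION_WEEKS = (4, 6)    # eat only low-FODMAP foods for 4-6 weeks
REINTRODUCTION_DAYS = 3       # one FODMAP at a time, escalating over 3 days

@dataclass
class FodmapReintroduction:
    fodmap: str                                    # e.g., "lactose", "fructans"
    daily_symptom_scores: list[int] = field(default_factory=list)

    def tolerated(self, threshold: int = 3) -> bool:
        # A FODMAP whose daily scores stay below the threshold across the
        # 3-day escalation can be liberalized in the personalized diet.
        return (len(self.daily_symptom_scores) == REINTRODUCTION_DAYS
                and all(s < threshold for s in self.daily_symptom_scores))

trial = FodmapReintroduction("lactose", daily_symptom_scores=[1, 2, 2])
print(trial.tolerated())  # True -> candidate for liberalization
```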
In contrast with the low-FODMAP diet, which has a relatively solid body of supporting evidence, efficacy data are still limited for treating IBS with a gluten-free diet. “Although observational studies found that most patients with IBS improve with a gluten-free diet, randomized controlled trials have yielded mixed results,” Dr. Chey and colleagues explained.
Their report cited a recent monograph that concluded gluten-free eating offered no significant benefit over placebo (relative risk, 0.46; 95% confidence interval, 0.16-1.28); because the confidence interval crosses 1.0, the numerically lower risk is not statistically significant. While some studies have documented positive results with a gluten-free diet, Dr. Chey and colleagues suggested that confounding variables such as the nocebo effect and the impact of other dietary factors have yet to be ruled out. “At present, it remains unclear whether a gluten-free diet is of benefit to patients with IBS.”
Dr. Chey and colleagues also explored IBS biomarkers. While some early data have shown that biomarkers may predict dietary responses, “there is insufficient evidence to support their routine use in clinical practice. ... Further efforts to identify and validate biomarkers that predict response to dietary interventions are needed to deliver ‘personalized nutrition.’ ”
The clinical practice update was commissioned and approved by the AGA CPU Committee and the AGA Governing Board. The researchers disclosed relationships with Biomerica, Salix, Mauna Kea Technologies, and others.
This article was updated May 19, 2022.
FROM GASTROENTEROLOGY
Cellular gene profiling may predict IBD treatment response
Transcriptomic profiling of phagocytes in the lamina propria of patients with inflammatory bowel disease (IBD) may guide future treatment selection, according to investigators.
Mucosal gut biopsies revealed that phagocytic gene expression correlated with inflammatory states, types of IBD, and responses to therapy, reported lead author Gillian E. Jacobsen, an MD/PhD candidate at the University of Miami, and colleagues.
In an article in Gastro Hep Advances, the investigators wrote that “lamina propria phagocytes along with epithelial cells represent a first line of defense and play a balancing act between tolerance toward commensal microbes and generation of immune responses toward pathogenic microorganisms. ... Inappropriate responses by lamina propria phagocytes have been linked to IBD.”
To better understand these responses, the researchers collected 111 gut mucosal biopsies from 54 patients with IBD, among whom 59% were taking biologics, 72% had inflammation in at least one biopsy site, and 41% had previously used at least one other biologic. Samples were analyzed to determine cell phenotypes, gene expression, and cytokine responses to in vitro Janus kinase (JAK) inhibitor exposure.
Ms. Jacobsen and colleagues noted that most reports that address the function of phagocytes focus on circulating dendritic cells, monocytes, or monocyte-derived macrophages, rather than on resident phagocyte populations located in the lamina propria. However, these circulating cells “do not reflect intestinal inflammation, or whole tissue biopsies.”
The investigators identified phagocytes based on CD11b expression and phenotyped CD11b+-enriched cells using flow cytometry. In samples with active inflammation, cells were most often granulocytes (45.5%), followed by macrophages (22.6%) and monocytes (9.4%). Uninflamed samples had a slightly lower proportion of granulocytes (33.6%), about the same proportion of macrophages (22.7%), and a higher rate of B cells (15.6% vs. 9.0%).
Ms. Jacobsen and colleagues highlighted the absolute increase in granulocytes, including neutrophils, in inflamed samples.
“Neutrophilic infiltration is a major indicator of IBD activity and may be critically linked to ongoing inflammation,” they wrote. “These data demonstrate that CD11b+ enrichment reflects the inflammatory state of the biopsies.”
The investigators also showed that transcriptional profiles of lamina propria CD11b+ cells differed “greatly” between colon and ileum, which suggested that “the location or cellular environment plays a marked role in determining the gene expression of phagocytes.”
CD11b+ cell gene expression profiles also correlated with ulcerative colitis versus Crohn’s disease, although the researchers noted that these patterns were less pronounced than the correlations with inflammatory states.
“There are pathways common to inflammation regardless of the IBD type that could be used as markers of inflammation or targets for therapy.”
Comparing colon samples from patients who responded to anti–tumor necrosis factor therapy with those who were refractory to anti-TNF therapy revealed significant associations between response type and 52 differentially expressed genes.
“These genes were mostly immunoglobulin genes up-regulated in the anti–TNF-treated inflamed colon, suggesting that CD11b+ B cells may play a role in medication refractoriness.”
Evaluating inflamed colon and anti-TNF refractory ileum revealed differential expression of OSM, a known marker of TNF-resistant disease, as well as TREM1, a proinflammatory marker. In contrast, NTS genes showed high expression in uninflamed samples on anti-TNF therapy. The researchers noted that these findings “may be used to build precision medicine approaches in IBD.”
Further experiments showed that in vitro exposure of anti-TNF refractory samples to JAK inhibitors resulted in significantly reduced secretion of interleukin-8 and TNF-alpha.
“Our study provides functional data that JAK inhibition with tofacitinib (JAK1/JAK3) or ruxolitinib (JAK1/JAK2) inhibits lipopolysaccharide-induced cytokine production even in TNF-refractory samples,” the researchers wrote. “These data inform the response of patients to JAK inhibitors, including those refractory to other treatments.”
The study was supported by Pfizer, the National Institute of Diabetes and Digestive and Kidney Diseases, the Micky & Madeleine Arison Family Foundation Crohn’s & Colitis Discovery Laboratory, and the Martin Kalser Chair in Gastroenterology at the University of Miami. The investigators disclosed additional relationships with Takeda, AbbVie, Eli Lilly, and others.
Inflammatory bowel diseases are complex and heterogeneous disorders driven by inappropriate immune responses to luminal substances, including diet and microbes, resulting in chronic inflammation of the gastrointestinal tract. Therapies for IBD largely center on suppressing immune responses; however, given the complexity and heterogeneity of these diseases, there is no consensus on which aspect of the immune response to suppress, or which cell type to target, in a given patient.
Sreeram Udayan, PhD, and Rodney D. Newberry, MD, are with the division of gastroenterology in the department of medicine at Washington University, St. Louis.
FROM GASTRO HEP ADVANCES
Deep learning system outmatches pathologists in diagnosing liver lesions
A new deep learning system can classify hepatocellular nodular lesions (HNLs) via whole-slide images, improving risk stratification of patients and diagnostic rate of hepatocellular carcinoma (HCC), according to investigators.
While the model requires further validation, it could eventually be used to optimize accuracy and efficiency of histologic diagnoses, potentially decreasing reliance on pathologists, particularly in areas with limited access to subspecialists.
In an article published in Gastroenterology, Na Cheng, MD, of Sun Yat-sen University, Guangzhou, China, and colleagues wrote that the “diagnostic process [for HNLs] is laborious, time-consuming, and subject to the experience of the pathologists, often with significant interobserver and intraobserver variability. ... Therefore, [an] automated analysis system is highly demanded in the pathology field, which could considerably ease the workload, speed up the diagnosis, and facilitate the in-time treatment.”
To this end, Dr. Cheng and colleagues developed the hepatocellular-nodular artificial intelligence model (HnAIM), which scans whole-slide images to identify seven types of tissue, including well-differentiated HCC, high-grade dysplastic nodules, low-grade dysplastic nodules, hepatocellular adenoma, focal nodular hyperplasia, and background tissue.
Developing and testing HnAIM was a multistep process that began with three subspecialist pathologists, who independently reviewed and classified liver slides from surgical resections; unanimous agreement was achieved for 649 slides from 462 patients. These slides were scanned to create whole-slide images, which were divided into sets for training (70%), validation (15%), and internal testing (15%). Performance, measured by area under the curve (AUC), exceeded 99.9% for the internal testing set. HnAIM was then validated externally on independent datasets.
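As a rough sketch of how such a split and evaluation might look in practice (hypothetical data and stand-in model outputs, not the authors’ pipeline), consider the following:

```python
# Hypothetical sketch of a patient-grouped 70/15/15 split and a multiclass
# AUC computation like the one reported above. Data, labels, and "model
# outputs" are random stand-ins, not the authors' code or results.

import numpy as np
from sklearn.model_selection import GroupShuffleSplit
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n_slides, n_patients, n_classes = 649, 462, 7
patient_ids = rng.integers(0, n_patients, size=n_slides)  # slides grouped by patient
labels = rng.integers(0, n_classes, size=n_slides)        # one tissue class per slide

# Grouping keeps all of a patient's slides in a single partition,
# approximating the 70%/15%/15% train/validation/test proportions.
outer = GroupShuffleSplit(n_splits=1, test_size=0.30, random_state=0)
train_idx, rest_idx = next(outer.split(np.zeros(n_slides), labels, patient_ids))
inner = GroupShuffleSplit(n_splits=1, test_size=0.50, random_state=0)
val_rel, test_rel = next(inner.split(np.zeros(len(rest_idx)),
                                     labels[rest_idx], patient_ids[rest_idx]))
val_idx, test_idx = rest_idx[val_rel], rest_idx[test_rel]

# Stand-in for per-slide class probabilities from a trained classifier.
probs = rng.dirichlet(np.ones(n_classes), size=len(test_idx))
auc = roc_auc_score(labels[test_idx], probs,
                    multi_class="ovr", labels=np.arange(n_classes))
print(f"macro one-vs-rest AUC on held-out slides: {auc:.3f}")
```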
First, HnAIM evaluated liver biopsy slides from 30 patients at one center. Results were compared with diagnoses made by nine pathologists classified as senior, intermediate, or junior. HnAIM correctly diagnosed 100% of the cases, while senior pathologists correctly diagnosed 94.4%, followed by intermediate (86.7%) and junior (73.3%) pathologists.
The researchers noted that the “rate of agreement with subspecialists was higher for HnAIM than for all 9 pathologists at distinguishing 7 liver tissues, with important diagnostic implications for fragmentary or scarce biopsy specimens.”
Next, HnAIM evaluated 234 samples from three hospitals. Accuracy was slightly lower, with an AUC of 93.5%. The researchers highlighted how HnAIM consistently differentiated precancerous lesions and well-differentiated HCC from benign lesions and background tissue.
A final experiment showed how HnAIM reacted to the most challenging cases. The investigators selected 12 cases without definitive diagnoses and found that, similar to the findings of three subspecialist pathologists, HnAIM did not reach a single diagnostic conclusion.
The researchers wrote: “This may be due to a number of potential reasons, such as inherent uncertainty in the 2-dimensional interpretation of a 3-dimensional specimen, the limited number of tissue samples, and cognitive factors such as anchoring.”
However, HnAIM contributed to the diagnostic process by generating multiple diagnostic possibilities with weighted likelihoods. After reviewing these results, the expert pathologists reached consensus in 5 of the 12 cases, and two of the three expert pathologists agreed on all 12 cases, improving the agreement rate from 25% to 100%.
The researchers concluded that the model holds promise for facilitating HNL diagnosis and improving diagnostic efficiency and quality, and that it could reduce pathologists’ workload, especially where subspecialists are unavailable.
The study was supported by the National Natural Science Foundation of China, the Guangdong Basic and Applied Basic Research Foundation, the Natural Science Foundation of Guangdong Province, and others. The investigators reported no conflicts of interest.
As the prevalence of hepatocellular carcinoma (HCC) continues to rise, the early and accurate detection and diagnosis of HCC remains paramount to improving patient outcomes. In cases of typical or advanced HCC, an accurate diagnosis is made using CT or MR imaging. However, hepatocellular nodular lesions (HNLs) with atypical or inconclusive radiographic appearances are often biopsied to achieve a histopathologic diagnosis. In addition, accurate diagnosis of an HNL following liver resection or transplantation is important to long-term surveillance and management. An accurate histopathologic diagnosis relies on the availability of experienced subspecialty pathologists and remains a costly and labor-intensive process that can lead to delays in diagnosis and care.
Hannah P. Kim, MD, MSCR, is an assistant professor in the division of gastroenterology, hepatology, and nutrition in the department of medicine at Vanderbilt University Medical Center, Nashville, Tenn. She has no conflicts of interest.
FROM GASTROENTEROLOGY
Liquid biopsy a valuable tool for detecting, monitoring HCC
Liquid biopsy using circulating tumor DNA (ctDNA) detection and profiling is a valuable tool for clinicians managing hepatocellular carcinoma (HCC), particularly for monitoring progression, researchers wrote in a recent review.
Details of the review, led by co–first authors Xueying Lyu and Yu-Man Tsui, both of the department of pathology and State Key Laboratory of Liver Research at the University of Hong Kong, were published in Cellular and Molecular Gastroenterology and Hepatology.
Because there are few treatment options for advanced-stage liver cancer, scientists are searching for noninvasive ways to detect liver cancer before it progresses. Liver resection is the primary treatment for HCC, but the recurrence rate is high. Early detection improves the ability to identify relevant molecular-targeted drugs and helps predict patient response.
There is growing interest in noninvasive circulating cell-free DNA (cfDNA) as well as in ctDNA – both are part of promising strategies to test circulating DNA in the bloodstream. Together with other circulating biomarkers, they are called liquid biopsy.
HCC can be detected noninvasively by identifying plasma ctDNA released from dying cancer cells. Detection depends on determining whether the circulating tumor DNA carries the same molecular alterations as its tumor source. Because cfDNA contains genomic DNA from different tumor clones, or from tumors at different sites within a patient, it can support real-time monitoring of tumor progression.
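As a schematic illustration of this matching step (not from the review; the variants and coordinates below are invented), a tumor-informed assay can be thought of as checking which alterations found in the tumor also appear in the plasma:

```python
# Illustrative sketch of tumor-informed ctDNA detection: variants are
# represented as (chromosome, position, alternate allele) tuples.
# All values below are invented for demonstration.
tumor_variants = {("chr17", 7674220, "T"), ("chr1", 115258747, "A")}
plasma_variants = {("chr17", 7674220, "T"), ("chr9", 5073770, "G")}

# ctDNA is "detected" when plasma DNA carries alterations matching the
# patient's tumor; unmatched plasma variants may reflect sequencing
# noise or clonal hematopoiesis rather than the tumor.
shared = tumor_variants & plasma_variants
print(f"{len(shared)} tumor alteration(s) detected in plasma: {shared}")
```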
Barriers to widespread clinical use of liquid biopsy include a lack of standardization of the collection process. Procedures differ across health systems on how much blood should be collected, which tubes should be used for collection, and how samples should be stored and shipped. The study authors suggested that “specialized tubes can be used for blood sample collection to reduce the chance of white blood cell rupture and genomic DNA contamination from the damaged white blood cells.”
Further research is needed
The study findings indicated that some aspects of liquid biopsy with cfDNA/ctDNA still need further exploration. For example, the effects of tumor vascularization, tumor aggressiveness, metabolic activity, and cell death mechanisms on the dynamics of ctDNA in the bloodstream need to be identified.
It’s not yet clear how cfDNA is released into the bloodstream. cfDNA actively released from the tumor may convey a different message from cfDNA released passively from dying cells upon treatment: the first represents treatment-resistant cells or subclones, while the second represents treatment-responsive ones. Moreover, ctDNA mutations are difficult to detect in early-stage cancers with lower tumor burden.
The investigators wrote: “The contributions of cfDNA from apoptosis, necrosis, autophagic cell death, and active release at different time points during disease progression, treatment response, and resistance appearance are poorly understood and will affect interpretation of the clinical observation in cfDNA assays.” A lower limit of detection needs to be determined, and a standard curve established, so that researchers can quantify mutant allelic frequencies in cfDNA and avoid false-negative results.
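A minimal sketch of what such a standard curve might look like, assuming a spike-in dilution series with known variant allele frequencies (all numbers here are invented):

```python
import numpy as np

# Hypothetical dilution series: known variant allele frequencies (VAF)
# of spiked-in mutant DNA and the assay signal each dilution produced.
known_vaf = np.array([0.001, 0.005, 0.01, 0.05, 0.10])
signal = np.array([0.8, 4.1, 8.3, 41.0, 82.5])

# Fit a linear standard curve: signal = slope * VAF + intercept.
slope, intercept = np.polyfit(known_vaf, signal, 1)

LOD = 0.001  # assumed lower limit of detection (0.1% VAF)

def estimate_vaf(sample_signal):
    """Map a measured signal back to a VAF via the standard curve."""
    vaf = (sample_signal - intercept) / slope
    if vaf < LOD:
        return None  # below LoD: report as indeterminate, not negative
    return vaf

print(estimate_vaf(6.0))  # roughly 0.007 with these invented numbers
```

The key design point matches the review's concern: a sample below the validated limit of detection is indeterminate rather than negative, which guards against false-negative calls at low tumor burden.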
They urged establishing external quality assurance to verify laboratory performance, proficiency in cfDNA diagnostic testing, and interpretation of results, in order to identify errors in sampling, procedures, and decision making. Legal liability and the cost-effectiveness of using plasma cfDNA in treatment decisions also need to be considered.
The researchers wrote that, to better understand how ctDNA/cfDNA can be used to complement precision medicine in liver cancer, large multicenter cohorts and long-term follow-up are needed to compare ctDNA-guided decision-making against standard treatment without guidance from ctDNA profiling.
The authors disclosed having no conflicts of interest.
Detection and characterization of circulating tumor DNA (ctDNA) is one of the major forms of liquid biopsy. Because ctDNA can reflect the molecular features of cancer tissues, it is considered an ideal alternative to tissue biopsy. Furthermore, it can overcome the limitations of tumor tissue biopsy, such as bleeding, needle tract seeding, and sampling error.
Currently, several large biomarker trials of ctDNA for early HCC detection are underway. Once its accuracy is established in phase 3-4 biomarker studies, the role of ctDNA in the context of existing surveillance programs should be further defined. Because combining ctDNA with other orthogonal circulating biomarkers has been shown to enhance performance, future research should explore biomarker panels that include ctDNA and other promising markers to maximize performance. Predictive biomarkers for treatment response are an unmet need in HCC. Investigating a specific ctDNA marker panel as a predictor of immunotherapy responsiveness would be of great interest and is under active investigation.
Ju Dong Yang, MD, is with the Karsh Division of Digestive and Liver Diseases in the department of medicine, the Comprehensive Transplant Center, and the Samuel Oschin Comprehensive Cancer Institute at Cedars-Sinai Medical Center, Los Angeles. He disclosed providing consulting services for Exact Sciences, Exelixis, and Eisai.
FROM CELLULAR AND MOLECULAR GASTROENTEROLOGY AND HEPATOLOGY
Ultraprocessed foods: Large study finds link with Crohn’s disease
Higher consumption of ultraprocessed foods was linked with a significantly higher risk of Crohn’s disease (CD), but not ulcerative colitis, in a large prospective cohort study published online in Clinical Gastroenterology and Hepatology.
Researchers, led by Chun-Han Lo, MD, of Massachusetts General Hospital, Boston, defined ultraprocessed foods “as ready-to-consume formulations of ingredients, typically created by [a] series [of] industrial techniques and processes. They frequently involve the incorporation of additives, such as sweeteners, preservatives, emulsifiers, thickeners, and flavors, which aid in food preservation and produce hyperpalatable products.”
The rising global incidence of inflammatory bowel disease (IBD) in regions undergoing Westernization has coincided with a marked increase in consumption of ultraprocessed food (UPF) over the past few decades, according to the authors. Previous studies have focused on links between individual nutrients and IBD; this study examined the role of processing itself. The analysis comprised 245,112 participants (203,516 women and 41,596 men) and 5,468,444 person-years of follow-up, drawn from three cohorts: the Nurses’ Health Study, the Nurses’ Health Study II, and the Health Professionals Follow-up Study.
In the highest quartile, UPFs made up on average nearly half (46.4%) of participants’ total energy consumption, compared with 21% in the lowest quartile.
The researchers found that compared with participants in the lowest quartile of simple updated UPF consumption, those in the highest quartile had a significantly increased risk of CD (adjusted hazard ratio, 1.70; 95% confidence interval, 1.23-2.35).
In addition, “a secondary analysis across different CD locations demonstrated that participants in the highest quartile of simple updated UPF intake had the highest risk of ileal, colonic, and ileocolonic CD,” the authors wrote.
Three groups of processed foods driving risk increase
Three groups of UPFs appeared to drive the increased risk of CD: ultraprocessed breads and breakfast foods; frozen or shelf-stable ready-to-eat/heat meals; and sauces, cheeses, spreads, and gravies.
Just as with overall consumption, researchers did not find an association between any of those three subgroups and UC risk.
The authors suggested several reasons for the link with Crohn’s disease. First, higher UPF consumption may mean those foods are displacing unprocessed or minimally processed foods, such as those rich in fiber. Second, UPFs contain additives, such as salt, that may promote intestinal inflammation. Third, artificial sweeteners in UPFs may predispose the gut to inflammation, as supported by sucralose/maltodextrin supplementation in mouse models of spontaneous ileitis.
As for why CD, but not UC, the authors said diet may be more relevant and have a stronger effect biologically in CD compared with UC. Another potential reason, they said, is that results “may reflect the greater specificity of dietary ligands and metabolites on the small intestine compared with the colon.”
Data from three large, prospective cohorts
Researchers used data from three ongoing, prospective nationwide cohorts of health professionals in the United States – the Nurses’ Health Study (1986-2014); the Nurses’ Health Study II (1991-2017); and the Health Professionals Follow-up Study (1986-2012).
In all three cohorts, participants completed questionnaires at enrollment and every 2 years thereafter, providing information such as medical history and lifestyle factors. Diet was assessed via validated semiquantitative food frequency questionnaires.
The researchers used Cox proportional hazards models, adjusted for confounders, to estimate hazard ratios (HRs) and 95% confidence intervals for Crohn’s disease and ulcerative colitis according to participants’ self-reported consumption of ultraprocessed foods.
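As a rough sketch of this kind of analysis (not the authors' code; the lifelines package and all column names here are illustrative assumptions), a confounder-adjusted Cox model could be fit in Python as follows:

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical person-level dataset: follow-up time in years, an
# indicator for incident Crohn's disease, UPF intake quartile
# (1 = lowest), and a few example confounders. Column names are
# assumptions, not the study's actual variables.
df = pd.read_csv("cohort.csv")

cph = CoxPHFitter()
cph.fit(
    df[["followup_years", "incident_cd", "upf_quartile",
        "age", "bmi", "smoking"]],
    duration_col="followup_years",
    event_col="incident_cd",
)

# exp(coef) gives each covariate's hazard ratio with 95% confidence
# intervals. The study compared quartiles categorically, so in practice
# the quartile variable would be entered as indicator (dummy) terms
# rather than the single linear term used in this simplified sketch.
cph.print_summary()
```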
Further studies could help determine which UPF components are driving the higher risk for Crohn’s disease and whether risk differs by length of exposure to UPFs.
“By avoiding UPF consumption, individuals might substantially lower their risk of developing CD in addition to gaining other health benefits,” the authors wrote.
A coauthor, James M. Richter, MD, is a consultant for Policy Analysis Inc and Takeda Pharmaceuticals. Andrew T. Chan, MD, serves as a consultant for Janssen Pharmaceuticals, Pfizer Inc, and Bayer Pharma AG. Ashwin N. Ananthakrishnan, MD, has served as a scientific advisory board member for Abbvie, Gilead, and Kyn Therapeutics, and received research grants from Pfizer and Merck. The remaining authors disclosed no conflicts. This work was supported by the National Institutes of Health, the Beker Foundation, the Chleck Family Foundation, and the Crohn’s and Colitis Foundation.
Because consumption of industrially manufactured foods has risen in parallel with the incidence of autoimmune diseases, modern diets are hypothesized to contribute to the development of inflammatory bowel disease. In this study, Lo and colleagues conducted a retrospective cohort study to determine whether individuals who reported higher levels of ultraprocessed food intake had higher rates of developing IBD. In their adjusted analysis, the authors report that the rate of developing Crohn’s disease was 70% higher for individuals in the highest quartile of ultraprocessed food consumption; no association was seen with ulcerative colitis.
While we await clarification about which ingredients are responsible, we should continue to encourage our patients to incorporate whole foods into their diets for both gastrointestinal and cardiometabolic health. At the same time, we must remain empathetic to systemic barriers to accessing and preparing high-quality, minimally processed foods. As such, we should advocate for policies and programs that mitigate food deserts. If food policy remains the status quo, this study illustrates a frightening possibility of how disparities in gastrointestinal health equity could worsen in the future.
Ravy K. Vajravelu, MD, MSCE, is an assistant professor of medicine in the division of gastroenterology, hepatology, and nutrition at the University of Pittsburgh Center for Health Equity Research and Promotion and the VA Pittsburgh Healthcare System. This commentary does not represent the views of the U.S. Department of Veterans Affairs or the United States Government. Dr. Vajravelu reports no relevant disclosures.
FROM CLINICAL GASTROENTEROLOGY AND HEPATOLOGY
Locoregional therapy lowers wait-list dropout in HCC
The use of bridging locoregional therapy (LRT) before liver transplantation in patients with hepatocellular carcinoma (HCC) has increased significantly in the United States over the past 15 years, a recent analysis suggests. The data show that liver transplant candidates with HCC who have a higher tumor burden, as well as those with more compensated liver disease, have received a greater number of treatments while awaiting transplant.
According to the researchers, led by Allison Kwong, MD, of Stanford (Calif.) University, liver transplant remains a curative option for individuals with unresectable HCC who meet prespecified size criteria. In the United States, a mandated waiting period of 6 months prior “to gaining exception points has been implemented” in an effort “to allow for consideration of tumor biology and reduce the disparities in wait-list dropout between HCC and non-HCC patients,” the researchers wrote.
Several forms of LRT are now available for HCC, including chemoembolization, external beam radiation, radioembolization, and radiofrequency or microwave ablation. In the liver transplant setting, these LRT options enable management of intrahepatic disease in patients who are waiting for liver transplant, Dr. Kwong and colleagues explained.
The researchers, who published their study findings in the May issue of Clinical Gastroenterology and Hepatology, sought to examine national temporal trends and wait-list outcomes of LRT among 31,609 liver transplant candidates in the United States with at least one approved HCC exception application.
Patient data were obtained from the Organ Procurement and Transplantation Network database and comprised primary adult LT candidates who were listed from the years 2003 to 2018. The investigators assessed explant histology and performed multivariable competing risk analysis to examine the relationship between the type of first LRT and time to wait-list dropout.
The wait-list dropout variable was defined by list removal because of death or excessive illness. The researchers noted that list removal likely represents disease progression “beyond transplantable criteria and beyond which patients were unlikely to benefit from or be eligible for further LRT.”
In the study population, the median age was 59 years, and approximately 77% of patients were male. More than half (53.1%) of the cohort had hepatitis C as the predominant liver disease etiology. Patients had a median follow-up period of 214 days on the waiting list.
Most patients (79%) received deceased- or living-donor transplants, and 18.6% were removed from the waiting list. For patients listed between 2003 and 2006, the median wait-list time was 123 days; for patients listed between 2015 and 2018, it increased to 257 days.
A total of 34,610 LRTs were performed among 24,145 liver transplant candidates during the study period. From 2003 to 2018, the proportion of patients with at least one LRT recorded in the database rose from 42.3% to 92.4%. Most patients (67.8%) who received liver-directed therapy had a single LRT, while 23.8% had two, 6.2% had three, and 2.2% had four or more.
The most frequent type of LRT performed was chemoembolization, followed by thermal ablation. Radioembolization increased from less than 5% in 2013 to 19% in 2018. Moreover, in 2018, chemoembolization accounted for 50% of LRTs, while thermal ablation accounted for 22% of LRTs.
The incidence rate of LRT per 100 wait-list days was above average in patients who had an initial tumor burden beyond the Milan criteria (0.188), an alpha-fetoprotein level of 21-40 ng/mL (0.171) or 41-500 ng/mL (0.179), or Child-Pugh class A disease (0.160), as well as in patients in short (0.151) and medium (0.154) wait-time regions and patients listed after implementation of the cap-and-delay policy in October 2015 (0.192).
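For readers unfamiliar with the unit, an incidence rate per 100 wait-list days is simply the number of events divided by total days at risk, scaled by 100. The counts below are invented so that the arithmetic reproduces the 0.188 reported for the beyond-Milan subgroup; the study does not publish the underlying counts.

```python
# Invented counts for illustration; the study reports only the rates.
lrt_count = 188          # hypothetical number of LRTs in the subgroup
waitlist_days = 100_000  # hypothetical total wait-list days at risk

rate_per_100_days = lrt_count / waitlist_days * 100
print(rate_per_100_days)  # 0.188 LRTs per 100 wait-list days
```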
In the multivariable competing-risk analysis for wait-list dropout, adjusting for initial tumor burden, AFP, Child-Pugh class, wait region, and listing era, receiving no locoregional therapy was associated with an increased risk of wait-list dropout, compared with chemoembolization as the first LRT (subhazard ratio, 1.37; 95% CI, 1.28-1.47). An inverse probability of treatment weighting–adjusted analysis found that radioembolization, compared with chemoembolization, was associated with a reduced risk of wait-list dropout (sHR, 0.85; 95% CI, 0.81-0.89). Thermal ablation was also associated with a reduced risk of wait-list dropout, compared with chemoembolization (sHR, 0.95; 95% CI, 0.91-0.99). “Radioembolization and thermal ablation may be superior to chemoembolization and prove to be more cost-effective options, depending on the clinical context,” the researchers wrote.
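The inverse probability of treatment weighting step can be sketched generically (this is not the authors' code; the scikit-learn propensity model and covariate names are assumptions): each patient is weighted by the inverse of the estimated probability of the treatment actually received, which balances measured confounders before the dropout model is fit.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical dataset: treated = 1 for radioembolization, 0 for
# chemoembolization; covariates are example confounders only.
df = pd.read_csv("waitlist.csv")
X = df[["tumor_burden", "afp", "child_pugh_score"]]
treated = df["treated"].to_numpy()

# Propensity score: modeled probability of receiving radioembolization
# given the measured covariates.
ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]

# Inverse probability of treatment weights. These weights would then be
# supplied to the competing-risk (subdistribution hazard) model that
# estimates wait-list dropout.
df["iptw"] = np.where(treated == 1, 1.0 / ps, 1.0 / (1.0 - ps))
print(df["iptw"].describe())
```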
The researchers noted that they were unable to determine whether patients removed from the waiting list dropped out because of disease progression or because of liver failure.
The researchers reported no conflicts of interest with the pharmaceutical industry. The study received no industry funding.
In 1996, Mazzaferro and colleagues reported the results of a cohort of 48 patients with cirrhosis who had small, unresectable hepatocellular carcinoma (HCC). The actuarial survival rate was 75% at 4 years, and 83% of these patients had no recurrence; consequently, orthotopic liver transplantation became one of the standard curative-intent options for the treatment of HCC. Because of HCC biology, some of these tumors grow or, in the worst case, fall outside the Milan criteria. Locoregional therapies (LRT) were applied to arrest or downsize the tumor(s) to bring them within liver transplantation criteria.
Kwong and colleagues, using the data of the Organ Procurement and Transplantation Network database, showed an exponential increase of LRT over 15 years: from 32.5% in 2003 to 92.4% in 2018. The Barcelona Clinic Liver Cancer staging system classifies chemoembolization, the most common LRT modality used in this cohort, as a palliative treatment rather than curative. Not surprisingly, the authors found that radioembolization was independently associated with a 15% reduction in the wait-list dropout rate, compared with chemoembolization. Further, listing in longer wait-time regions and more recent years was independently associated with a higher likelihood of wait-list dropout.
These data may be worrisome for patients listed for HCC. The median Model for End-Stage Liver Disease at Transplant minus 3 (MMaT-3) policy, introduced in May 2019, decreased transplantation rates in patients with HCC. Consequently, longer wait-list times lead to increased utilization of LRT to keep these patients within criteria. Radioembolization could become the preferred LRT for arresting tumor growth, rather than chemoembolization, and will probably be more cost-effective. Future work should address explant outcomes, outcomes of downstaging with external radiation therapy, and adjuvant use of immunotherapy.
Ruben Hernaez, MD, MPH, PhD, is an assistant professor at the Michael E. DeBakey Veterans Affairs Medical Center and Baylor College of Medicine, both in Houston. He has no relevant conflicts to disclose.
In 1996, Mazzaferro and colleagues reported the results of a cohort of 48 patients with cirrhosis who had small, unresectable hepatocellular carcinoma (HCC). The actuarial survival rate was 75% at 4 years, and 83% of these patients had no recurrence, so, orthotopic liver transplantation became one of the standard options with curative intent for the treatment HCC. Because of HCC biology, some of these tumors grow or, worst-case scenario, are outside the Milan criteria. Locoregional therapies (LRT) were applied to arrest or downsize the tumor(s) to be within the liver transplantation criteria.
Kwong and colleagues, using the data of the Organ Procurement and Transplantation Network database, showed an exponential increase of LRT over 15 years: from 32.5% in 2003 to 92.4% in 2018. The Barcelona Clinic Liver Cancer staging system classifies chemoembolization, the most common LRT modality used in this cohort, as a palliative treatment rather than curative. Not surprisingly, the authors found that radioembolization was independently associated with a 15% reduction in the wait-list dropout rate, compared with chemoembolization. Further, listing in longer wait-time regions and more recent years was independently associated with a higher likelihood of wait-list dropout.
These data may be worrisome for patients listed for HCC. The median Model for End-Stage Liver Disease at Transplant Minus 3 National Policy, introduced in May 2019, decreases the transplantation rates in patients with HCC. Consequently, longer wait-list time leads to increase utilization of LRT to keep these patients within criteria. Radioembolization could become the preferred LRT therapy to stop tumor growth than chemoembolization and, probably, will be more cost effective. Future work should address explant outcomes and outcome on downstaging with external radiation therapy and adjuvant use of immunotherapy.
Ruben Hernaez, MD, MPH, PhD, is an assistant professor at the Michael E. DeBakey Veterans Affairs Medical Center and Baylor College of Medicine, both in Houston. He has no relevant conflicts to disclose.
In 1996, Mazzaferro and colleagues reported the results of a cohort of 48 patients with cirrhosis who had small, unresectable hepatocellular carcinoma (HCC). The actuarial survival rate was 75% at 4 years, and 83% of these patients had no recurrence, so, orthotopic liver transplantation became one of the standard options with curative intent for the treatment HCC. Because of HCC biology, some of these tumors grow or, worst-case scenario, are outside the Milan criteria. Locoregional therapies (LRT) were applied to arrest or downsize the tumor(s) to be within the liver transplantation criteria.
Kwong and colleagues, using the data of the Organ Procurement and Transplantation Network database, showed an exponential increase of LRT over 15 years: from 32.5% in 2003 to 92.4% in 2018. The Barcelona Clinic Liver Cancer staging system classifies chemoembolization, the most common LRT modality used in this cohort, as a palliative treatment rather than curative. Not surprisingly, the authors found that radioembolization was independently associated with a 15% reduction in the wait-list dropout rate, compared with chemoembolization. Further, listing in longer wait-time regions and more recent years was independently associated with a higher likelihood of wait-list dropout.
These data may be worrisome for patients listed for HCC. The median Model for End-Stage Liver Disease at Transplant Minus 3 National Policy, introduced in May 2019, decreases the transplantation rates in patients with HCC. Consequently, longer wait-list time leads to increase utilization of LRT to keep these patients within criteria. Radioembolization could become the preferred LRT therapy to stop tumor growth than chemoembolization and, probably, will be more cost effective. Future work should address explant outcomes and outcome on downstaging with external radiation therapy and adjuvant use of immunotherapy.
Ruben Hernaez, MD, MPH, PhD, is an assistant professor at the Michael E. DeBakey Veterans Affairs Medical Center and Baylor College of Medicine, both in Houston. He has no relevant conflicts to disclose.
The use of bridging locoregional therapy (LRT) before liver transplantation in patients with hepatocellular carcinoma (HCC) has significantly increased in the United States within the past 15 years, a recent analysis suggests. Data show that liver transplant candidates with HCC who have elevated tumor burden and patients with more compensated liver disease have received a greater number of treatments while awaiting transplant.
According to the researchers, led by Allison Kwong, MD, of Stanford (Calif.) University, liver transplant remains a curative option for individuals with unresectable HCC who meet prespecified size criteria. In the United States, a mandated waiting period of 6 months prior “to gaining exception points has been implemented” in an effort “to allow for consideration of tumor biology and reduce the disparities in wait-list dropout between HCC and non-HCC patients,” the researchers wrote.
Several forms of LRT are now available for HCC, including chemoembolization, external beam radiation, radioembolization, and radiofrequency or microwave ablation. In the liver transplant setting, these LRT options enable management of intrahepatic disease in patients who are waiting for liver transplant, Dr. Kwong and colleagues explained.
The researchers, who published their study findings in the May issue of Clinical Gastroenterology and Hepatology, sought to examine the national temporal trends and wait-list outcomes of LRT in 31,609 patients eligible for liver transplant with greater than or equal to one approved HCC exception application in the United States.
Patient data were obtained from the Organ Procurement and Transplantation Network database and comprised primary adult liver transplant candidates listed from 2003 to 2018. The investigators assessed explant histology and performed a multivariable competing-risk analysis to examine the relationship between the type of first LRT and time to wait-list dropout.
The wait-list dropout variable was defined as list removal because of death or excessive illness. The researchers noted that list removal likely represents disease progression “beyond transplantable criteria and beyond which patients were unlikely to benefit from or be eligible for further LRT.”
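For readers unfamiliar with the competing-risks framing used in the analysis, the sketch below estimates the cumulative incidence of wait-list dropout while treating transplantation as a competing event rather than as ordinary censoring. It is a minimal illustration using the Aalen-Johansen estimator from the lifelines Python library and wholly invented data; the study's own subhazard regression models are more elaborate.

```python
# Minimal competing-risks sketch with invented example data (not the
# study's dataset). Event codes: 0 = censored, 1 = wait-list dropout
# (death/too sick), 2 = transplant (a competing event).
import pandas as pd
from lifelines import AalenJohansenFitter

df = pd.DataFrame({
    "days_on_list": [120, 257, 90, 400, 214, 33, 310, 180],
    "event":        [2,   1,   2,  0,   1,   2,  2,   1],
})

# Aalen-Johansen estimates the cumulative incidence of dropout (event 1).
# A naive 1 - Kaplan-Meier estimate would treat transplantation as simple
# censoring and overstate the dropout risk.
ajf = AalenJohansenFitter()
ajf.fit(df["days_on_list"], df["event"], event_of_interest=1)
print(ajf.cumulative_density_.tail())
```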
In the study population, the median age was 59 years, and approximately 77% of patients were male. More than half (53.1%) of the cohort had hepatitis C as the predominant liver disease etiology. Patients had a median follow-up period of 214 days on the waiting list.
Most patients (79%) received a deceased- or living-donor transplant, and 18.6% were removed from the waiting list. For patients listed between 2003 and 2006, the median wait-list time was 123 days; for those listed between 2015 and 2018, it increased to 257 days.
A total of 34,610 LRTs were performed among 24,145 liver transplant candidates during the study period. From 2003 to 2018, the proportion of patients with at least one LRT recorded in the database rose from 42.3% to 92.4%. Most patients (67.8%) who received liver-directed therapy had a single LRT, while 23.8% had two, 6.2% had three, and 2.2% had four or more.
The most frequent type of LRT performed was chemoembolization, followed by thermal ablation. The share of radioembolization increased from less than 5% in 2013 to 19% in 2018; in 2018, chemoembolization accounted for 50% of LRTs and thermal ablation for 22%.
The incidence rate of LRT per 100 wait-list days was above average in patients who had an initial tumor burden beyond the Milan criteria (0.188), an alpha-fetoprotein level of 21-40 ng/mL (0.171) or 41-500 ng/mL (0.179), or Child-Pugh class A disease (0.160); in patients in short (0.151) and medium (0.154) wait-time regions; and in patients listed after implementation of the cap-and-delay policy in October 2015 (0.192).
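To make the rate metric concrete, the snippet below shows the arithmetic behind an incidence rate per 100 wait-list days. The counts are hypothetical, chosen only so the result matches one of the reported figures.

```python
# Hypothetical illustration of "LRT incidence rate per 100 wait-list days":
# total treatments divided by total person-days on the list, scaled by 100.

def lrt_rate_per_100_days(n_treatments: int, total_waitlist_days: float) -> float:
    """Incidence rate of LRT per 100 wait-list days."""
    return n_treatments / total_waitlist_days * 100

# e.g., a subgroup with 470 LRTs over 250,000 person-days on the wait list
# (invented numbers) yields a rate of 0.188 per 100 wait-list days --
# roughly one treatment for every 532 days of wait-list time.
print(lrt_rate_per_100_days(470, 250_000))  # 0.188
```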
In the multivariable competing-risk analysis for wait-list dropout, which adjusted for initial tumor burden, AFP, Child-Pugh class, wait region, and listing era, receiving no locoregional therapy was associated with an increased risk of wait-list dropout compared with chemoembolization as the first LRT (subhazard ratio, 1.37; 95% CI, 1.28-1.47). An inverse probability of treatment weighting–adjusted analysis found that radioembolization was associated with a reduced risk of wait-list dropout compared with chemoembolization (sHR, 0.85; 95% CI, 0.81-0.89). Thermal ablation was also associated with a reduced risk of wait-list dropout compared with chemoembolization (sHR, 0.95; 95% CI, 0.91-0.99). “Radioembolization and thermal ablation may be superior to chemoembolization and prove to be more cost-effective options, depending on the clinical context,” the researchers wrote.
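The inverse probability of treatment weighting used in this comparison can be sketched simply: each patient is weighted by the inverse of the estimated probability of the treatment he or she actually received, so that measured confounders are balanced across treatment groups before outcomes are compared. Below is a minimal, hypothetical illustration using scikit-learn; the covariates and data are invented, and the study's actual weighting and subhazard models are more involved.

```python
# Minimal IPTW sketch with invented data -- not the study's implementation.
# Weight = 1/P(received radioembolization | covariates) for treated patients,
# and 1/(1 - P) for patients who received chemoembolization instead.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000
X = rng.normal(size=(n, 3))           # stand-ins for tumor burden, AFP, Child-Pugh
treated = rng.integers(0, 2, size=n)  # 1 = radioembolization, 0 = chemoembolization

# Propensity score: estimated probability of treatment given covariates.
ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]
weights = np.where(treated == 1, 1 / ps, 1 / (1 - ps))

# After weighting, covariate means should be similar in the two pseudo-groups,
# mimicking a randomized comparison on these measured confounders.
for arm in (0, 1):
    m = treated == arm
    print(arm, np.average(X[m], axis=0, weights=weights[m]))
```

In practice, the weighted pseudo-population is then fed into the outcome model (here, a competing-risk regression of time to wait-list dropout), which is what yields the weighting-adjusted subhazard ratios reported above.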
The researchers noted that they were unable to determine whether patients removed from the waiting list dropped out because of disease progression or because of liver failure.
The researchers reported no conflicts of interest with the pharmaceutical industry. The study received no industry funding.
FROM CLINICAL GASTROENTEROLOGY AND HEPATOLOGY