More selective antibiotic shows promise for C. diff. infection
An investigational, novel oral antibiotic with greater selectivity than vancomycin, metronidazole, and even fidaxomicin may offer improved protection of healthy gut bacteria during the treatment of Clostridioides difficile infection (CDI), according to ongoing research.
“CDI treatment has historically been dominated by metronidazole and vancomycin,” said Katherine Johnson, DO, from the Western Infectious Disease Consultants, P.C., Denver. However, these broad-spectrum drugs negatively affect healthy bacteria in the gut and increase the risk of CDI recurrence.
This is also a problem for drugs in the CDI antibiotic pipeline: Many candidate drugs have failed because of their broad-spectrum activity, she added during a session at the Peggy Lillis Foundation 2022 National C. diff Advocacy Summit.
“An ideal CDI therapy would be a very narrow-spectrum antibiotic that has a minimal effect on normal gut bacteria,” she said.
Dr. Johnson is currently working on a phase 2 clinical trial evaluating the novel antibiotic, dubbed CRS3123, for the treatment of primary CDI and first-recurrence CDI. The investigational agent inhibits a form of methionyl-tRNA synthetase, an enzyme strictly required for protein biosynthesis in C. diff., making it an attractive target for the treatment of both primary and recurrent CDI.
In her session, Dr. Johnson reported that CRS3123 inhibits the damaging toxins produced by C. diff., potentially resulting in more rapid symptom resolution. Additionally, owing to its novel mode of action, no C. diff. strains resistant to CRS3123 have been identified to date.
She presented findings from an animal study that showed that CRS3123 was superior to vancomycin in terms of prolonging survival. She also presented findings from phase 1 clinical trials that showed that most adverse events (AEs) associated with CRS3123 were mild. No serious AEs were reported.
A ‘huge infectious burden’
If successful in further research, CRS3123 could offer significant value to patients with C. diff., especially those with recurrent infection, given the sometimes extreme clinical, quality-of-life, and economic burdens associated with CDI.
“CDI is a huge infectious burden to the U.S. health care system and globally has been listed by the Centers for Disease Control and Prevention as an urgent threat,” Byron Vaughn, MD, from the University of Minnesota, told this news organization.
“Despite a number of antibiotic stewardship and infection control and prevention efforts, we haven’t seen much of a change in the incidence of CDI,” he said, noting that the risk of recurrence can be as high as 30%.
While oral vancomycin is effective for treating C. diff., Dr. Vaughn noted that the antibiotic lacks selectivity and destroys healthy gut bacteria, resulting in substantial dysbiosis. “Dysbiosis is really the key to getting recurrent C. diff.,” he explained, “because if you have healthy gut bacteria, you will inherently resist CDI.”
Dr. Vaughn stated that his center is in the start-up phase of becoming a site for a clinical trial of CRS3123. The hope is that CRS3123, because its spectrum is narrower than that of fidaxomicin and vancomycin, will not induce intestinal dysbiosis. “It really just treats the C. diff. and leaves every other bug alone so that your gut bacteria can recover while the C. diff. is being treated,” he said. “And then when you stop CRS3123, you have healthy gut bacteria already present to prevent recurrence.”
If this is confirmed in large-scale trials, there could be a “dramatic decrease” in the rates of recurrent C. diff., said Dr. Vaughn.
Aside from the potential clinical impact, the economic implications of a novel selective antibiotic that preserves healthy gut bacteria could be significant, he added. “Depending on exactly what population you’re looking at, probably about a third of the cost of C. diff. is actually attributable to recurrence. That’s a huge economic burden that could be improved.”
Dr. Johnson is an employee of Crestone, which is developing CRS3123. Dr. Vaughn reports no relevant financial relationships.
A version of this article first appeared on Medscape.com.
The importance of toxin testing in C. difficile infection: Understanding the results
Clostridioides difficile infection (CDI) is often confirmed through toxin testing, yet toxin tests alone may not be sufficient for diagnosing CDI or guiding treatment decisions.
“The presence of a toxigenic strain does not always equal disease,” said David Lyerly, PhD, during a session on C. difficile toxin testing at the Peggy Lillis Foundation 2022 National C. diff Advocacy Summit.
Dr. Lyerly, the chief science officer at Techlab, explained that exotoxins A and B are produced by specific strains of C. difficile and drive infection, but some patients who test positive for toxigenic strains by polymerase chain reaction or other tests do not have CDI or are not appropriate candidates for CDI treatment.
Several studies conducted during the past decade, however, support the importance of toxin detection. Some research has suggested that toxin-positive patients tend to have more clinically severe disease than those who test negative, he noted.
Although of limited value when used alone, toxin testing is needed to confirm a CDI diagnosis and to support antibiotic stewardship, Dr. Lyerly said.
He suggested that, in addition to toxin testing, there is a need for molecular measures and other improved diagnostics to identify candidates most likely to benefit from CDI treatment.
“Because we generally detect toxin genes instead of toxin proteins, you can identify persons colonized with toxigenic C. difficile who do not actually have CDI,” Kevin W. Garey, PharmD, from the University of Houston, said in an interview.
Dr. Garey added that a person could likewise have low levels of toxins that aren’t detected by toxin tests but could still have CDI.
“Given this, better diagnostics that incorporate active toxin production and your body’s response to those toxins are needed,” he said, especially since C. difficile toxins are responsible for disease sequelae, including gastroenteritis, colonic perforation, sepsis, and death.
Toxin testing a ‘controversial area’
“C. difficile toxin testing has been a controversial area for almost a decade or more,” Shruti K. Gohil, MD, from University of California, Irvine, Health Epidemiology and Infection Prevention, said in an interview.
Dr. Gohil noted that toxin testing more specifically identifies clinical C. difficile colitis but, by itself, can miss C. difficile. “So, we are in this conundrum nationally,” she said.
“Many facilities will use a double- or triple-test strategy to make sure that you have a true C. difficile case mandating the use of antibiotics,” she explained. “The reason we test specifically with the enzyme immunoassay or toxin test is to know whether or not you have real C. difficile that’s actively producing the toxin for colitis.”
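Her description of multistep testing amounts to simple decision logic. As an illustration only, the sketch below encodes one hedged reading of a two-step strategy (a sensitive molecular screen followed by a toxin assay); the function name and result wording are invented for this example, and this is not a clinical algorithm.

```python
# Illustrative sketch of two-step C. difficile test interpretation:
# a sensitive NAAT/PCR screen, then a toxin EIA to check for active
# toxin production. Simplified for exposition; not clinical guidance.

def interpret_two_step(naat_positive: bool, toxin_eia_positive: bool) -> str:
    if not naat_positive:
        return "No toxigenic C. difficile detected; CDI unlikely."
    if toxin_eia_positive:
        return "Toxigenic strain with active toxin production; consistent with CDI."
    # NAAT-positive but toxin-negative is the ambiguous case discussed
    # above: possible colonization rather than active infection.
    return "Toxigenic strain without detectable toxin; possible colonization, correlate clinically."

print(interpret_two_step(naat_positive=True, toxin_eia_positive=False))
```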
A patient with C. difficile who has been treated and is in recovery may still test positive on a C. difficile toxin test, added Dr. Gohil. “It would be great if we had a test that could really judge an active, clinical C. difficile infection. This [test] would help in identifying the right patients who need treatment and would also be able to tell if a patient has been cleared of C. difficile.”
Dr. Lyerly is an employee of Techlab. Dr. Garey and Dr. Gohil reported no relevant financial relationships.
A version of this article first appeared on Medscape.com.
Locoregional therapy lowers wait-list dropout in HCC
The use of bridging locoregional therapy (LRT) before liver transplantation in patients with hepatocellular carcinoma (HCC) has significantly increased in the United States within the past 15 years, a recent analysis suggests. Data show that liver transplant candidates with HCC who have a higher tumor burden, as well as those with more compensated liver disease, have received greater numbers of treatments while awaiting transplant.
According to the researchers, led by Allison Kwong, MD, of Stanford (Calif.) University, liver transplant remains a curative option for individuals with unresectable HCC who meet prespecified size criteria. In the United States, a mandated 6-month waiting period before gaining exception points has been implemented “to allow for consideration of tumor biology and reduce the disparities in wait-list dropout between HCC and non-HCC patients,” the researchers wrote.
Several forms of LRT are now available for HCC, including chemoembolization, external beam radiation, radioembolization, and radiofrequency or microwave ablation. In the liver transplant setting, these LRT options enable management of intrahepatic disease in patients who are waiting for liver transplant, Dr. Kwong and colleagues explained.
The researchers, who published their study findings in the May issue of Clinical Gastroenterology and Hepatology, sought to examine national temporal trends and wait-list outcomes of LRT in 31,609 liver transplant candidates in the United States with at least one approved HCC exception application.
Patient data were obtained from the Organ Procurement and Transplantation Network database and comprised primary adult LT candidates who were listed from the years 2003 to 2018. The investigators assessed explant histology and performed multivariable competing risk analysis to examine the relationship between the type of first LRT and time to wait-list dropout.
The wait-list dropout variable was defined by list removal because of death or excessive illness. The researchers noted that list removal likely represents disease progression “beyond transplantable criteria and beyond which patients were unlikely to benefit from or be eligible for further LRT.”
In the study population, the median age was 59 years, and approximately 77% of patients were male. More than half (53.1%) of the cohort had hepatitis C as the predominant liver disease etiology. Patients had a median follow-up period of 214 days on the waiting list.
Most patients (79%) received deceased- or living-donor transplants, and 18.6% of patients were removed from the waiting list. For patients listed between 2003 and 2006, the median wait-list time was 123 days; for those listed between 2015 and 2018, it rose to 257 days.
A total of 34,610 LRTs were performed among 24,145 liver transplant candidates during the study period. From 2003 to 2018, the proportion of patients with at least one LRT recorded in the database rose from 42.3% to 92.4%. Most patients (67.8%) who received liver-directed therapy had a single LRT, while 23.8% had two LRTs, 6.2% had three LRTs, and 2.2% had four or more LRTs.
The most frequent type of LRT was chemoembolization, followed by thermal ablation. Radioembolization grew from less than 5% of LRTs in 2013 to 19% in 2018. In 2018, chemoembolization accounted for 50% of LRTs and thermal ablation for 22%.
The incidence rate of LRT per 100 wait-list days was above average among patients with an initial tumor burden beyond the Milan criteria (0.188), an alpha-fetoprotein level of 21-40 ng/mL (0.171) or 41-500 ng/mL (0.179), or Child-Pugh class A disease (0.160); among patients in short (0.151) and medium (0.154) wait-time regions; and among patients listed after implementation of the cap-and-delay policy in October 2015 (0.192).
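As a point of arithmetic, a rate “per 100 wait-list days” is the number of events divided by the total days at risk, multiplied by 100. A minimal sketch, with invented numbers rather than study data:

```python
# Compute an incidence rate per 100 wait-list days:
# events / person-time at risk, scaled by 100.

def rate_per_100_days(n_events: int, total_waitlist_days: float) -> float:
    return 100.0 * n_events / total_waitlist_days

# Invented example: 188 LRTs over 100,000 cumulative wait-list days
print(rate_per_100_days(188, 100_000))  # -> 0.188
```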
In the multivariable competing-risk analysis for wait-list dropout, which adjusted for initial tumor burden, AFP level, Child-Pugh class, wait region, and listing era, receiving no locoregional therapy was associated with an increased risk of wait-list dropout, compared with chemoembolization as the first LRT (subhazard ratio, 1.37; 95% CI, 1.28-1.47). An inverse probability of treatment weighting–adjusted analysis found an association between radioembolization, compared with chemoembolization, and a reduced risk of wait-list dropout (sHR, 0.85; 95% CI, 0.81-0.89). Thermal ablation was also associated with a reduced risk of wait-list dropout, compared with chemoembolization (sHR, 0.95; 95% CI, 0.91-0.99). “Radioembolization and thermal ablation may be superior to chemoembolization and prove to be more cost-effective options, depending on the clinical context,” the researchers wrote.
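For readers unfamiliar with competing-risk methods: on a transplant wait list, dropout and transplantation compete, so naive survival estimates overstate cumulative dropout. The hedged sketch below illustrates the competing-risks idea with the nonparametric Aalen-Johansen estimator from the lifelines library on synthetic data; it is a simplified stand-in for, not a reproduction of, the multivariable subhazard models the authors fit.

```python
# Cumulative incidence of wait-list dropout when transplant is a
# competing event, estimated nonparametrically (Aalen-Johansen).
# Synthetic data for illustration only; not study data.
import numpy as np
from lifelines import AalenJohansenFitter

rng = np.random.default_rng(0)
n = 500
durations = rng.exponential(scale=250, size=n)             # days on the wait list
# 0 = censored, 1 = dropout (event of interest), 2 = transplant
events = rng.choice([0, 1, 2], size=n, p=[0.1, 0.2, 0.7])

ajf = AalenJohansenFitter()
ajf.fit(durations, events, event_of_interest=1)
print(ajf.cumulative_density_.tail())  # cumulative incidence of dropout
```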
The researchers noted that, among patients removed from the waiting list, they were unable to distinguish removal for disease progression from removal for liver failure.
The researchers reported no conflicts of interest with the pharmaceutical industry. The study received no industry funding.
In 1996, Mazzaferro and colleagues reported the results of a cohort of 48 patients with cirrhosis who had small, unresectable hepatocellular carcinoma (HCC). The actuarial survival rate was 75% at 4 years, and 83% of these patients had no recurrence; on the strength of those results, orthotopic liver transplantation became one of the standard curative-intent options for the treatment of HCC. Because of HCC biology, some of these tumors grow or, in the worst case, progress beyond the Milan criteria. Locoregional therapies (LRT) have therefore been applied to arrest or downsize tumors so that patients remain within liver transplantation criteria.
Kwong and colleagues, using data from the Organ Procurement and Transplantation Network database, showed an exponential increase in LRT use over 15 years: from 32.5% in 2003 to 92.4% in 2018. The Barcelona Clinic Liver Cancer staging system classifies chemoembolization, the most common LRT modality used in this cohort, as a palliative treatment rather than a curative one. Not surprisingly, the authors found that radioembolization was independently associated with a 15% reduction in the wait-list dropout rate, compared with chemoembolization. Further, listing in longer wait-time regions and in more recent years was independently associated with a higher likelihood of wait-list dropout.
These data may be worrisome for patients listed for HCC. The median Model for End-Stage Liver Disease at Transplant minus 3 national policy, introduced in May 2019, has decreased transplantation rates in patients with HCC. Consequently, longer wait-list times lead to increased utilization of LRT to keep these patients within criteria. Radioembolization could become the preferred LRT over chemoembolization for arresting tumor growth and will probably prove more cost effective. Future work should address explant outcomes, outcomes of downstaging with external radiation therapy, and the adjuvant use of immunotherapy.
Ruben Hernaez, MD, MPH, PhD, is an assistant professor at the Michael E. DeBakey Veterans Affairs Medical Center and Baylor College of Medicine, both in Houston. He has no relevant conflicts to disclose.
Researchers present cellular atlas of the human gut
New research sheds light on how different cell types behave across all intestinal regions and demonstrates variations in gene expression between these cells across three independent organ donors.
The researchers, led by Joseph Burclaff, PhD, of the University of North Carolina at Chapel Hill, explained that the regional differences observed in the study “highlight the importance of regional selection when studying the gut.” Dr. Burclaff and colleagues, whose findings were published online in Cellular and Molecular Gastroenterology and Hepatology, wrote that they hope their “database serves as a resource to understand how drugs affect the intestinal epithelium and as guidance for future precision medicine approaches.”
In the study, Dr. Burclaff and colleagues performed single-cell transcriptomics covering the duodenum, jejunum, and ileum, as well as the ascending, transverse, and descending colon, from three independently processed organ donors. The donors varied in age, race, and body mass index.
The investigators evaluated 12,590 single epithelial cells for organ-specific lineage biomarkers, differentially regulated genes, receptors, and drug targets. The focus of the analyses was on intrinsic cell properties and their capacity for response to extrinsic signals found along the gut axis.
The research group assigned cells to 25 epithelial lineage clusters. According to the researchers, multiple accepted intestinal cell markers did not specifically mark all intestinal stem cells. In addition, the investigators found that lysozyme expression was not unique to Paneth cells, demonstrating that lysozyme alone is insufficient for marking human Paneth cells, and that these cells lacked expression of certain “expected niche factors.”
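For context, lineage clusters like these typically come out of a standard single-cell workflow: quality filtering, normalization, dimensionality reduction, graph-based clustering, and marker-gene ranking. The sketch below shows such a generic pipeline using the scanpy library; the input file name and all parameters are placeholders, and this is not the authors’ actual pipeline.

```python
# Generic single-cell clustering workflow (scanpy); illustrative only.
import scanpy as sc

adata = sc.read_h5ad("gut_epithelium.h5ad")     # hypothetical input file

sc.pp.filter_cells(adata, min_genes=200)        # basic quality filter
sc.pp.normalize_total(adata, target_sum=1e4)    # depth normalization
sc.pp.log1p(adata)
sc.pp.highly_variable_genes(adata, n_top_genes=2000)

sc.pp.pca(adata, n_comps=50)                    # dimensionality reduction
sc.pp.neighbors(adata)                          # k-nearest-neighbor graph
sc.tl.leiden(adata, resolution=1.0)             # graph-based clusters

# Rank marker genes per cluster (e.g., LYZ, BEST4) to label lineages
sc.tl.rank_genes_groups(adata, groupby="leiden")
```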
Bestrophin-4+ (BEST4+) cells, which expressed neuropeptide Y, demonstrated maturational differences between the colon and small intestine, suggesting organ-specific maturation for tuft and BEST4+ cells. In addition, the data from Dr. Burclaff and colleagues suggest BEST4+ cells are engaged in “diverse roles within the intestinal epithelium, laying the groundwork for functional studies.”
The researchers noted that “tuft cells possess a broad ability to interact with the innate and adaptive immune systems through previously unreported receptors.” Specifically, the researchers found these cells exhibit genes believed to be important for taste signaling, monitoring intestinal content, and signaling the immune system.
Certain classes of cell junctions, hormones, mucins, and nutrient absorption genes demonstrated “unappreciated regional expression differences across lineages,” the researchers wrote. The investigators added that the differential expression of receptors as well as drug targets across lineages demonstrated “biological variation and the potential for variegated responses.”
The researchers noted that while the regional differences identified in their study show the importance of regional selection during gut investigations, several previous colonic single-cell RNA sequencing studies did not specify the sample region or explain “if pooled samples are from consistent regions.”
In the study, the investigators also assessed how drugs may affect the intestinal epithelium and why certain side effects associated with pharmacologic agents occur. The researchers identified 498 drugs approved by the Food and Drug Administration that had 232 primary gene targets expressed in the gut epithelial dataset.
In their analysis, the researchers found that carboxylesterase-2, which metabolizes the drug irinotecan into its biologically active form, SN-38, is the most highly expressed phase 1 metabolism gene in the small intestine. The phase 2 enzyme UGT1A1, which inactivates SN-38, shows low expression in the gut epithelium. The researchers explained that this finding suggests irinotecan may undergo prolonged activation in the gut, supporting the notion that the orally administered agent may have efficacy against cancers of the intestine.
The researchers concluded their “database provides a foundation for understanding individual contributions of diverse epithelial cells across the length of the human intestine and colon to maintain physiologic function.”
The researchers reported no conflicts of interest with the pharmaceutical industry. The study received no industry funding.
Single-cell transcriptomics has revolutionized our understanding of complex tissues, as this technology enables the identification of rare and/or novel cell types. Gastrointestinal science has benefited greatly from these technical advances, with multiple studies profiling the liver, pancreas, stomach, and intestine in health and disease, in both mouse and human samples.
The study by Burclaff and colleagues recently published in Cellular and Molecular Gastroenterology and Hepatology is the most complete analysis of the healthy human intestine to date, profiling over 12,000 single epithelial cells from three donors along the anterior-posterior axis from duodenum to descending colon. In a truly monumental work covering 35 journal pages, the authors not only delineate in great detail the various cell lineages – from stem cell to fully differentiated enterocyte, for instance – but also make surprising discoveries that will change our thinking about fundamental issues in gastrointestinal biology. Notably, they find that human small intestinal Paneth cells, known for the production of antimicrobial peptides and long thought to be a critical component of the intestinal stem cell niche, do not express any of the niche factors, including mitogens such as epidermal growth factor, that had been attributed to Paneth cells in mice. The authors conclude that human Paneth cells are not major niche-supporting cells, in keeping with the recent identification of subepithelial telocytes as the critical cells that support crypt proliferation in mice. In addition, the authors’ analysis of so-called “BEST4” cells, an intestinal lineage absent from the mouse gut, suggests a novel function for this rare cell type in metal absorption.
In sum, this study is the “final answer” for GI biologists needing a complete compendium of all genes active in the multitude of specialized human intestinal epithelial cells.
Klaus H. Kaestner, PhD, MS, is with the department of genetics and the Center for Molecular Studies in Digestive and Liver Diseases at the University of Pennsylvania, Philadelphia. He declares having no conflicts of interest.
Single cell transcriptomics has revolutionized our understanding of complex tissues, as this technology enables the identification of rare and/or novel cell types. Gastrointestinal science has benefited greatly from these technical advances, with multiple studies profiling liver, pancreas, stomach and intestine in health and disease, both in mouse and human samples.
The study by Burclaff and colleagues recently published in Cellular and Molecular Gastroenterology and Hepatology is the most complete analysis of the healthy human intestine to date, profiling over 12,000 single epithelial cells from three donors along the anterior-posterior axis from duodenum to descending colon. In a truly monumental work covering 35 journal pages, the authors not only delineate in great detail the various cell lineages – from stem cell to full differentiated enterocyte, for instance – but also make surprising discoveries that will change our thinking about fundamental issues in gastrointestinal biology. For instance, they find that human small intestinal Paneth cells, known for the production of antimicrobial peptides and long thought to be a critical component of the intestinal stem cell niche, do not express any of the niche factors, including mitogens such as epidermal growth factor, that had been attributed to Paneth cells in mice. The authors conclude that human Paneth cells are not major niche-supporting cells, in keeping with the recent identification of subepithelial telocytes as the critical cells that support crypt proliferation in mice. In addition, the authors’ analysis of so called “BEST4” cells, an intestinal lineage absent from the mouse gut, suggests a novel function for this rare cell type in metal absorption.
In sum, this study is the “final answer” for GI biologists needing a complete compendium of all genes active in the multitude of specialized human intestinal epithelial cells.
Klaus H. Kaestner, PhD, MS, is with the department of genetics and the Center for Molecular Studies in Digestive and Liver Diseases at the University of Pennsylvania, Philadelphia. He declares having no conflicts of interest.
Single cell transcriptomics has revolutionized our understanding of complex tissues, as this technology enables the identification of rare and/or novel cell types. Gastrointestinal science has benefited greatly from these technical advances, with multiple studies profiling liver, pancreas, stomach and intestine in health and disease, both in mouse and human samples.
The study by Burclaff and colleagues recently published in Cellular and Molecular Gastroenterology and Hepatology is the most complete analysis of the healthy human intestine to date, profiling over 12,000 single epithelial cells from three donors along the anterior-posterior axis from duodenum to descending colon. In a truly monumental work covering 35 journal pages, the authors not only delineate in great detail the various cell lineages – from stem cell to full differentiated enterocyte, for instance – but also make surprising discoveries that will change our thinking about fundamental issues in gastrointestinal biology. For instance, they find that human small intestinal Paneth cells, known for the production of antimicrobial peptides and long thought to be a critical component of the intestinal stem cell niche, do not express any of the niche factors, including mitogens such as epidermal growth factor, that had been attributed to Paneth cells in mice. The authors conclude that human Paneth cells are not major niche-supporting cells, in keeping with the recent identification of subepithelial telocytes as the critical cells that support crypt proliferation in mice. In addition, the authors’ analysis of so called “BEST4” cells, an intestinal lineage absent from the mouse gut, suggests a novel function for this rare cell type in metal absorption.
In sum, this study is the “final answer” for GI biologists needing a complete compendium of all genes active in the multitude of specialized human intestinal epithelial cells.
Klaus H. Kaestner, PhD, MS, is with the department of genetics and the Center for Molecular Studies in Digestive and Liver Diseases at the University of Pennsylvania, Philadelphia. He declares having no conflicts of interest.
New research sheds light on how different cell types behave across all intestinal regions and demonstrates variations in gene expression between these cells across three independent organ donors.
Research led by Joseph Burclaff, PhD, of the University of North Carolina at Chapel Hill, explained that the regional differences observed in the study “highlight the importance of regional selection when studying the gut.” Dr. Burclaff and colleagues, whose findings were published online in Cellular and Molecular Gastroenterology and Hepatology, wrote that they hope their “database serves as a resource to understand how drugs affect the intestinal epithelium and as guidance for future precision medicine approaches.”
In the study, Dr. Burclaff and colleagues performed single-cell transcriptomics that covered the duodenum, jejunum, ileum, as well as ascending, descending, and transverse colon from three independently processed organ donors. The donors varied in age, race, and body mass index.
The investigators evaluated 12,590 single epithelial cells for organ-specific lineage biomarkers, differentially regulated genes, receptors, and drug targets. The focus of the analyses was on intrinsic cell properties and their capacity for response to extrinsic signals found along the gut axis.
The research group assigned cells to 25 epithelial lineage clusters. According to the researchers, multiple accepted intestinal cell markers did not specifically mark all intestinal stem cells. In addition, the investigators explained that lysozyme expression was not unique to Paneth cells, and these cells lacked expression of certain “expected niche factors.” In fact, the researchers demonstrated lysozyme’s insufficiency for marking human Paneth cells.
Bestrophin-4+ (BEST4+) cells, which expressed neuropeptide Y, demonstrated maturational differences between the colon and small intestine, suggesting organ-specific maturation for tuft and BEST4+ cells. In addition, the data from Dr. Burclaff and colleagues suggest BEST4+ cells are engaged in “diverse roles within the intestinal epithelium, laying the groundwork for functional studies.”
The researchers noted that “tuft cells possess a broad ability to interact with the innate and adaptive immune systems through previously unreported receptors.” Specifically, the researchers found these cells exhibit genes believed to be important for taste signaling, monitoring intestinal content, and signaling the immune system.
Certain classes of cell junctions, hormones, mucins, and nutrient absorption genes demonstrated “unappreciated regional expression differences across lineages,” the researchers wrote. The investigators added that the differential expression of receptors as well as drug targets across lineages demonstrated “biological variation and the potential for variegated responses.”
The researchers noted that while the regional differences identified in their study show the importance of regional selection during gut investigations, several previous colonic single-cell RNA sequencing studies did not specify the sample region or explain “if pooled samples are from consistent regions.”
In the study, the investigators also assessed how drugs may affect the intestinal epithelium and why certain side effects associated with pharmacologic agents occur. The researchers identified 498 drugs approved by the Food and Drug Administration that had 232 primary gene targets expressed in the gut epithelial dataset.
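Conceptually, this tally reduces to joining a table of drug-target pairs against the expression dataset and counting drugs with at least one target that clears an expression threshold. The minimal pandas sketch below illustrates the idea; the tables, expression values, and cutoff are hypothetical stand-ins, not the study's data.

```python
import pandas as pd

# Hypothetical drug-target table: one row per (drug, primary gene target)
drug_targets = pd.DataFrame({
    "drug":   ["irinotecan", "omeprazole"],
    "target": ["TOP1",       "ATP4A"],
})

# Hypothetical mean expression of each gene per epithelial lineage
mean_expr = pd.DataFrame({
    "gene":       ["TOP1", "ATP4A", "CES2"],
    "enterocyte": [2.1,     0.0,     5.3],
    "goblet":     [1.8,     0.0,     0.9],
})

# Call a target "expressed in the gut epithelium" if any lineage clears a cutoff
cutoff = 1.0
lineages = [c for c in mean_expr.columns if c != "gene"]
expressed = mean_expr.loc[(mean_expr[lineages] > cutoff).any(axis=1), "gene"]

hits = drug_targets[drug_targets["target"].isin(expressed)]
print(f"{hits['drug'].nunique()} drug(s) with >=1 expressed primary target")
```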
In their analysis, the researchers found that carboxylesterase-2, which metabolizes the drug irinotecan into its biologically active form SN-38, is the most highly expressed phase 1 metabolism gene in the small intestine, whereas the phase 2 enzyme UGT1A1, which inactivates SN-38, shows low gut epithelial expression. The researchers explained that this imbalance suggests irinotecan may remain active in the gut for a prolonged period, supporting the notion that the orally administered agent may have efficacy against cancers of the intestine.
The researchers concluded their “database provides a foundation for understanding individual contributions of diverse epithelial cells across the length of the human intestine and colon to maintain physiologic function.”
The researchers reported no conflicts of interest with the pharmaceutical industry. The study received no industry funding.
FROM CELLULAR AND MOLECULAR GASTROENTEROLOGY AND HEPATOLOGY
Locoregional therapy lowers waitlist dropout in HCC
The use of bridging locoregional therapy (LRT) before liver transplantation in patients with hepatocellular carcinoma (HCC) has significantly increased in the United States within the past 15 years, a recent analysis suggests. Data show that liver transplant candidates with HCC who have elevated tumor burden and patients with more compensated liver disease have received a greater number of treatments while awaiting transplant.
According to the researchers, led by Allison Kwong, MD, of Stanford (Calif.) University, liver transplant remains a curative option for individuals with unresectable HCC who meet prespecified size criteria. In the United States, a mandated waiting period of 6 months prior “to gaining exception points has been implemented” in an effort “to allow for consideration of tumor biology and reduce the disparities in waitlist dropout between HCC and non-HCC patients,” the researchers wrote.
Several forms of LRT are now available for HCC, including chemoembolization, external beam radiation, radioembolization, and radiofrequency or microwave ablation. In the liver transplant setting, these LRT options enable management of intrahepatic disease in patients who are waiting for liver transplant, Dr. Kwong and colleagues explained.
The researchers, who published their study findings online August 3 in Clinical Gastroenterology and Hepatology, sought to examine the national temporal trends and waitlist outcomes of LRT in 31,609 patients eligible for liver transplant with greater than or equal to 1 approved HCC exception application in the United States.
Patient data were obtained from the Organ Procurement and Transplantation Network database and comprised primary adult LT candidates who were listed from the years 2003 to 2018. The investigators assessed explant histology and performed multivariable competing risk analysis to examine the relationship between the type of first LRT and time to waitlist dropout.
The waitlist dropout variable was defined by list removal due to death or excessive illness. The researchers noted that list removal likely represents disease progression “beyond transplantable criteria and beyond which patients were unlikely to benefit from or be eligible for further LRT.”
In the study population, the median age was 59 years, and approximately 77% of patients were male. More than half (53.1%) of the cohort had hepatitis C as the predominant liver disease etiology. Patients had a median follow-up period of 214 days on the waitlist.
Most patients (79%) received deceased- or living-donor transplants, and 18.6% of patients were removed from the waitlist. For patients listed between 2003 and 2006, the median waitlist time was 123 days; for those listed between 2015 and 2018, it increased to 257 days.
A total of 34,610 LRTs were performed among 24,145 liver transplant candidates during the study period. From 2003 to 2018, the proportion of patients with greater than or equal to 1 LRT recorded in the database rose from 42.3% to 92.4%. Most patients (67.8%) who received liver-directed therapy had a single LRT, while 23.8% of patients had 2 LRTs, 6.2% had 3 LRTs, and 2.2% had greater than or equal to 4 LRTs.
The most frequent type of LRT performed was chemoembolization, followed by thermal ablation. Radioembolization increased from less than 5% in 2013 to 19% in 2018. Moreover, in 2018, chemoembolization accounted for 50% of LRTs, while thermal ablation accounted for 22% of LRTs.
The incidence rates of LRT per 100 waitlist days were above average in patients who had an initial tumor burden beyond the Milan criteria (0.188), an alpha-fetoprotein level of 21-40 ng/mL (0.171) or 41-500 ng/mL (0.179), or Child-Pugh class A disease (0.160), as well as in patients in short (0.151) and medium (0.154) wait-time regions and patients listed following the implementation of cap-and-delay in October 2015 (0.192).
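A rate "per 100 waitlist days" is simply the number of events divided by the total person-time at risk, scaled by 100. The sketch below shows the arithmetic; the counts are hypothetical and chosen only to land on the same scale as the figures above.

```python
# Events per 100 person-days: events / total person-time * 100
n_lrt = 47             # LRTs received by a subgroup (hypothetical count)
waitlist_days = 25_000 # total days that subgroup spent waitlisted (hypothetical)

rate_per_100_days = n_lrt / waitlist_days * 100
print(f"{rate_per_100_days:.3f} LRTs per 100 waitlist days")  # -> 0.188
```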
In the multivariable competing-risk analysis for waitlist dropout, adjusting for initial tumor burden, AFP, Child-Pugh class, wait region, and listing era, receiving no locoregional therapy was associated with an increased risk of waitlist dropout compared with chemoembolization as the first LRT (subhazard ratio [sHR], 1.37; 95% CI, 1.28-1.47). The inverse probability of treatment weighting–adjusted analysis found an association between radioembolization, when compared with chemoembolization, and a reduced risk of waitlist dropout (sHR, 0.85; 95% CI, 0.81-0.89). Thermal ablation was also associated with a reduced risk of waitlist dropout, compared with chemoembolization (sHR, 0.95; 95% CI, 0.91-0.99). “Radioembolization and thermal ablation may be superior to chemoembolization and prove to be more cost-effective options, depending on the clinical context,” the researchers wrote.
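A Fine-Gray-type competing-risk regression such as this is typically fit with specialized packages (for example, crr in the R package cmprsk). As a lighter illustration of the underlying idea in Python, the sketch below estimates the cumulative incidence of waitlist dropout while treating transplant as a competing event, using the Aalen-Johansen estimator from lifelines. The records are invented, and this nonparametric estimate is a cousin of, not a reproduction of, the authors' adjusted model.

```python
import pandas as pd
from lifelines import AalenJohansenFitter

# Invented waitlist records: days on the list plus an outcome code, where
# 0 = censored, 1 = dropout (death or too sick), 2 = transplanted
df = pd.DataFrame({
    "days":  [120, 257, 90, 400, 310, 45, 600, 210],
    "event": [1,   2,   2,  0,   1,   2,  0,   2],
})

# Cumulative incidence of dropout, treating transplant as a competing event
ajf = AalenJohansenFitter()
ajf.fit(df["days"], df["event"], event_of_interest=1)
print(ajf.cumulative_density_)
```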
The researchers noted that, among patients removed from the waitlist, they were unable to distinguish those with disease progression from those with liver failure.
The researchers reported no conflicts of interest with the pharmaceutical industry. The study received no industry funding.
FROM CLINICAL GASTROENTEROLOGY AND HEPATOLOGY
Guselkumab found promising for Crohn’s in phase 2 study
Guselkumab appears safe and effective in patients with moderate to severe Crohn’s disease, according to phase 2 trial data.
Conventional first-line therapies for Crohn’s disease (CD) often are not effective for maintaining clinical remission and are associated with significant toxicity concerns, wrote study investigator William J. Sandborn, MD, of the University of California, San Diego, and colleagues. Guselkumab is a human monoclonal antibody that selectively inhibits the p19 subunit of interleukin 23, a cytokine that plays an important role in gut inflammation, the researchers wrote. Their report was published online Feb. 5 in Gastroenterology.
In the phase 2 GALAXI-1 study, Dr. Sandborn and colleagues evaluated the safety and efficacy of guselkumab in 309 patients who had had moderate to severe CD for at least 3 months. All patients previously had experienced either an inadequate response or intolerance to conventional treatment or biologic agents.
Patients were randomly assigned to placebo (n = 61); intravenous guselkumab at 200 mg (n = 61), 600 mg (n = 63), or 1,200 mg (n = 61) at weeks 0, 4, and 8; or a reference arm of ustekinumab, given as approximately 6 mg/kg IV at week 0 and 90 mg subcutaneously at week 8 (n = 63).
The study’s primary endpoint was the change from baseline to week 12 in the CD Activity Index score. The mean age of the population was 38.8 years, and the mean duration of CD was 8.8 years.
Some patients in the primary efficacy analysis set discontinued the study before week 12. At one point, the study was paused to assess a serious adverse event of toxic hepatitis in a guselkumab-treated patient. Fifty-one patients were discontinued from the study because their induction treatment was paused during the adverse event evaluation; these patients were, however, included in the safety analyses.
At the 12-week follow-up assessment, patients assigned to all doses of guselkumab experienced significantly greater reductions in the CD Activity Index from baseline when compared with placebo (least squares mean: 200 mg: –160.4; 600 mg: –138.9; and 1,200 mg: –144.9 vs. placebo: –36.2; all P < .05). In addition, a significantly greater proportion of patients in each guselkumab arm achieved clinical remission compared with the placebo group (CD Activity Index < 150; 57.4%, 55.6%, and 45.9% vs. 16.4%; all P < .05).
Among the patients who had an inadequate response or intolerance to prior biologic therapy, 47.5% of those in the combined guselkumab arm and 10.0% in the placebo arm met the criteria for clinical remission at 12 weeks. In addition, 62.4% of patients in the combined guselkumab group and 20% in the placebo group within the prior biologic therapy subgroup achieved clinical response at week 12.
Of patients with an inadequate response or intolerance to prior conventional therapy, approximately 60% of those treated with guselkumab (across all doses) vs. 22.6% of the placebo group had clinical remission by 12 weeks. Also within this subgroup, 70.2% of patients in the combined guselkumab arm and 29% in the placebo arm had a clinical response.
Finally, among the 360 patients in the safety analysis set, the proportions of patients with at least one adverse event were similar across the treatment groups during the treatment period (60% for placebo; 45.7% for guselkumab combined; 50.7% for ustekinumab).
There was no observable relationship between the dose of guselkumab and the proportion of patients with adverse events. Infection rates were 21.4% in the placebo arm, 15.1% in the combined guselkumab group, and 12.7% in the ustekinumab arm. Approximately 3.7% of patients in the combined guselkumab arm, 5.7% of patients in the placebo arm, and 5.6% of patients in the ustekinumab arm experienced at least one serious adverse event.
Greater proportions of patients receiving guselkumab achieved clinical response, Patient Reported Outcomes–2 remission, clinical-biomarker response, and endoscopic response at week 12 vs. placebo. Efficacy of ustekinumab vs. placebo was demonstrated. Safety event rates were generally similar across treatment groups.
Limitations of the study included the small number of patients in the overall dataset and the relatively short treatment period of 12 weeks. The researchers noted that phase 3 studies of guselkumab for the treatment of Crohn’s disease are underway.
Several of the researchers reported conflicts of interest with the pharmaceutical industry. The study received funding from Janssen Research & Development, LLC.
Over the last 20 years, multiple targeted therapies have been developed for Crohn’s disease (CD) and have changed the management landscape for this chronic disease. Despite many successes, a proportion of patients still experience treatment failure or intolerance to the currently available biologics, and the need for ongoing development of new therapies remains. This study by Sandborn and colleagues highlights the development of guselkumab, a novel therapy that targets a more specific interleukin pathway (IL-23p19 inhibition) than currently available agents. In the study, guselkumab was found to be effective at improving multiple clinical parameters, such as the Crohn’s Disease Activity Index and Patient-Reported Outcome–2, as well as objective parameters, including biomarker response and endoscopic response, in patients with moderate to severe CD. There was no apparent exposure-response relationship across the dose regimens. Guselkumab also demonstrated a favorable safety profile.
For clinicians, the promising results from this phase 2 trial bring hope of additional treatment options for Crohn’s disease patients. As the management landscape for CD changes further, options for patients will grow, and thoughtful decisions regarding sequencing of the available therapies will become more important. More selective interleukin inhibition with IL-23p19 has been shown to be superior to dual blockade of IL-12/IL-23 in psoriasis; however, it is unknown whether the same will be true for Crohn’s disease. Further research will be needed to address any potential efficacy and safety differences between selective IL-23p19 inhibition and dual IL-12/IL-23 blockade.
Robin Dalal, MD, is an assistant professor of medicine, director of IBD education, and director of the advanced IBD fellowship at Vanderbilt University Medical Center in Nashville, Tenn. She reported being a consultant for AbbVie.
FROM GASTROENTEROLOGY
Registry data support lowering CRC screening age to 45
Approximately one-third of people between 45 and 49 years of age who undergo colonoscopies have neoplastic colorectal pathology, according to a retrospective analysis.
According to the researchers, led by Parth Trivedi, MD, of the Icahn School of Medicine at Mount Sinai, New York, there has been a “disturbing” rise in early-onset colorectal cancer (CRC) in the United States, which has prompted guideline bodies from the American Cancer Society to the U.S. Preventive Services Task Force to recommend lowering the CRC screening starting age to 45 years for average-risk individuals. Despite these recommendations, little research to date has fully characterized the prevalence of colorectal neoplasia in individuals younger than 50 years, the long-standing starting age for screening.
Dr. Trivedi and colleagues, who published their study findings in Gastroenterology, retrospectively reviewed colonoscopy data recorded in the Gastrointestinal Quality Improvement Consortium Registry to address the current knowledge gaps on early-onset CRC. Collected data were for procedures conducted at 123 AMSURG ambulatory endoscopy centers across 29 states between January 2014 and February 2021. In total, 2,921,816 colonoscopies during the study period among patients aged 18-54 years were recorded by AMSURG-associated endoscopists; of these, 562,559 met inclusion criteria for high-quality screening or diagnostic colonoscopy procedures.
The researchers pooled a young-onset group of patients aged 18-49 years, in whom 145,998 procedures were performed, including 79,934 procedures in patients aged 45-49 years. A comparator group with 336,627 procedures in patients aged 50-54 years was also included in the study. The findings were categorized into CRC, advanced premalignant lesions (APLs), and “any neoplasia,” the latter of which included all adenomas, sessile serrated polyps, and CRC.
Among patients aged 18-44 years, the most frequent indications were “diagnostic-other” (45.6%) as well as “diagnostic-bleeding” (39.4%). Among patients between 45 and 49 years of age, the most frequent indications were “screening” (41.4%) and “diagnostic-other” (30.7%). Nearly all (90%) procedures among those aged 50-54 years were for screening.
A multivariable logistic regression identified 5 variables predictive of either APL or CRC in patients between 18 and 49 years of age: increasing age (odds ratio, 1.08; 95% confidence interval, 1.07-1.08; P <0.01), male sex (OR = 1.67; 95% CI, 1.63-1.70; P <0.01), White race (vs. African American: OR = 0.76; 95% CI, 0.73-0.79, P <0.01; vs. Asian: OR = 0.89; 95% CI, 0.84-0.94, P <0.01), family history of CRC (OR = 1.21; 95% CI, 1.16-1.26; P <0.01) and polyps (OR = 1.33; 95% CI, 1.24-1.43; P <0.01), and examinations for bleeding (OR = 1.15; 95% CI, 1.12-1.18; P <0.01) or screening (OR = 1.20; 95% CI, 1.16-1.24; P <0.01).
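Adjusted odds ratios such as these are obtained by fitting a multivariable logistic regression and exponentiating its coefficients. The statsmodels sketch below shows the mechanics on synthetic data; the variable names and values are hypothetical, not the registry records, so the resulting ORs will hover near 1.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in for the registry data (names and values are hypothetical)
rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "apl_or_crc": rng.integers(0, 2, n),  # outcome: APL or CRC found
    "age":        rng.integers(18, 50, n),
    "male":       rng.integers(0, 2, n),
    "family_hx":  rng.integers(0, 2, n),
})

# Fit the multivariable logistic model, then exponentiate to the OR scale
model = smf.logit("apl_or_crc ~ age + male + family_hx", data=df).fit(disp=False)
odds_ratios = np.exp(model.params)   # e.g., OR per 1-year increase in age
ci = np.exp(model.conf_int())        # 95% confidence intervals for the ORs
print(pd.concat([odds_ratios, ci], axis=1))
```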
The prevalence of neoplastic findings in the young-onset age group increased with increasing age for the categories of any neoplasia, APLs, and CRC. Among patients aged 40-44, 26.59% had any neoplasia, 5.76% had APLs, and 0.53% had CRC. In those aged 45-49 years, around 32% had any neoplasia, approximately 7.5% had APLs, and 0.58% had CRC. In the 50- to 54-year-old group, the prevalences of any neoplasia, APL, and CRC were 37.72%, 9.48%, and 0.32%, respectively.
Across all age groups, a family history of CRC was associated with a higher prevalence of any neoplasia and APL. In addition, the rates of APL and any neoplasia in patients with a family history of CRC were comparable to those of patients who were 5 years older but had no family history of the disease. Across most young-onset age groups, individuals with a positive family history had a lower CRC prevalence than patients with no family history.
The researchers noted that their population data are derived from ambulatory endoscopy centers, which may introduce bias associated with insurance coverage or patient preference to attend specific endoscopic centers. Additionally, the investigators stated that many records on race and ethnicity were missing, further limiting the findings.
“The present analysis of neoplastic colorectal pathology among individuals younger than age 50 suggests that lowering the screening age to 45 for men and women of all races and ethnicities will likely detect important pathology rather frequently,” they concluded. In addition, the researchers noted that the study results “underscore the importance of early messaging to patients and providers in the years leading up to age 45.” Ultimately, improved “awareness of pathology prevalence in individuals younger than age 45 can help guide clinicians in the clinical management of CRC risk,” the researchers wrote.
Several of the researchers reported conflicts of interest with Exact Sciences Corp and Freenome. The study received no industry funding.
An alarming trend of increased colorectal cancer (CRC) incidence has been noted among individuals 20-49 years of age over the past 2 decades. This fact, combined with the results of microsimulation studies, has led all purveyors of CRC screening guidelines in the United States to lower their recommended age for the initiation of average-risk screening from 50 to 45. Despite this major shift in recommendations, relatively little is known about the rates of premalignant neoplasia in this population.
The current study by Trivedi et al. presents the results of a massive retrospective cross-sectional study of findings at initial colonoscopy in patients aged 18-54 years who underwent endoscopy at an AMSURG ambulatory surgical center between 2014 and 2021. Data concerning these procedures had previously been collected using the GI Quality Improvement Consortium (GIQuIC) registry. They found that the prevalence of advanced premalignant lesions (APLs) in those aged 45-49 was almost as high as in those aged 50-54, and that the prevalence of CRC was actually higher in the younger group. Moreover, 40- to 44-year-olds had APL and CRC prevalence rates almost as high as those aged 45-49. They further found that increasing age, male sex, White race, family history of CRC, and examinations performed for bleeding indications or screening were all associated with higher odds of APLs and CRC. They concluded that these data provide support for lowering the screening age to 45 for all average-risk individuals.
Future studies will need to document the actual effectiveness of CRC screening in persons aged 45-49 and examine the cost-benefit of lowering the recommended screening age.
Reid M. Ness, MD, MPH, AGAF is an associate professor in the division of gastroenterology, hepatology and nutrition, department of medicine, Vanderbilt University Medical Center, Nashville, Tenn., and at the VA Tennessee Valley Healthcare System, Nashville campus. He is also an investigator in the Vanderbilt-Ingram Cancer Center. Dr. Ness is a study investigator with Guardant Health.
An alarming trend of increased colorectal cancer (CRC) incidence has been noted among individuals 20-49 years of age over the past 2 decades. This fact combined with the results of microsimulation studies have led all purveyors of CRC screening guidelines in the United States to lower their recommended age for the initiation of average-risk screening from 50 to 45. Despite this major shift in recommendations, relatively little is known about the rates of premalignant neoplasia in this population.
The current study by Trivedi et. al. presents the results of a massive retrospective cross-sectional study of findings at initial colonoscopy in patients 18-54 who underwent endoscopy at an AMSURG ambulatory surgical center between 2014 and 2021. Data concerning these procedures had previously been collected using the GI Quality Improvement Consortium (GIQuIC) registry. They found that the prevalence of advanced premalignant lesions (APLs) in those 45-49 was almost as high as in those 50-54, and that the prevalence of CRC was even higher. Moreover, 40- to 44-year-olds had APL and CRC prevalence rates almost as high as those aged 45-49. They further found that increasing age, male sex, White race, family history of CRC, and examinations performed for bleeding indications or screening were all associated with higher odds for APLs and CRC. They concluded that these data provide support for lowering the screening age to 45 for all average-risk individuals.
Future studies will need to document the actual effectiveness of CRC screening in persons aged 45-49 and examine the actual cost-benefit of lowering the recommended screening age.
Reid M. Ness, MD, MPH, AGAF is an associate professor in the division of gastroenterology, hepatology and nutrition, department of medicine, Vanderbilt University Medical Center, Nashville, Tenn., and at the VA Tennessee Valley Healthcare System, Nashville campus. He is also an investigator in the Vanderbilt-Ingram Cancer Center. Dr. Ness is a study investigator with Guardant Health.
An alarming trend of increased colorectal cancer (CRC) incidence has been noted among individuals 20-49 years of age over the past 2 decades. This fact combined with the results of microsimulation studies have led all purveyors of CRC screening guidelines in the United States to lower their recommended age for the initiation of average-risk screening from 50 to 45. Despite this major shift in recommendations, relatively little is known about the rates of premalignant neoplasia in this population.
The current study by Trivedi et. al. presents the results of a massive retrospective cross-sectional study of findings at initial colonoscopy in patients 18-54 who underwent endoscopy at an AMSURG ambulatory surgical center between 2014 and 2021. Data concerning these procedures had previously been collected using the GI Quality Improvement Consortium (GIQuIC) registry. They found that the prevalence of advanced premalignant lesions (APLs) in those 45-49 was almost as high as in those 50-54, and that the prevalence of CRC was even higher. Moreover, 40- to 44-year-olds had APL and CRC prevalence rates almost as high as those aged 45-49. They further found that increasing age, male sex, White race, family history of CRC, and examinations performed for bleeding indications or screening were all associated with higher odds for APLs and CRC. They concluded that these data provide support for lowering the screening age to 45 for all average-risk individuals.
Future studies will need to document the actual effectiveness of CRC screening in persons aged 45-49 and examine the actual cost-benefit of lowering the recommended screening age.
Reid M. Ness, MD, MPH, AGAF is an associate professor in the division of gastroenterology, hepatology and nutrition, department of medicine, Vanderbilt University Medical Center, Nashville, Tenn., and at the VA Tennessee Valley Healthcare System, Nashville campus. He is also an investigator in the Vanderbilt-Ingram Cancer Center. Dr. Ness is a study investigator with Guardant Health.
Approximately one-third of people between 45 and 49 years of age who undergo colonoscopies have neoplastic colorectal pathology, according to a retrospective analysis.
According to the researchers, led by Parth Trivedi, MD, of the Icahn School of Medicine at Mount Sinai, New York, there has progressively been a “disturbing” rise in early-onset colorectal cancer (CRC) in the United States, which has prompted guidelines from the American Cancer Society to the U.S. Preventive Services Task Force to recommend lowering the CRC screening starting age to 45 years old for average-risk individuals. Despite these recommendations, little research to date has fully characterized the prevalence of colorectal neoplasia in individuals younger than the currently recommended CRC onset screening age of 50 years.
Dr. Trivedi and colleagues, who published their study findings in Gastroenterology, retrospectively reviewed colonoscopy data recorded in the Gastrointestinal Quality Improvement Consortium Registry to address the current knowledge gaps on early-onset CRC. Collected data were for procedures conducted at 123 AMSURG ambulatory endoscopy centers across 29 states between January 2014 and February 2021. In total, 2,921,816 colonoscopies during the study period among patients aged 18-54 years were recorded by AMSURG-associated endoscopists; of these, 562,559 met inclusion criteria for high-quality screening or diagnostic colonoscopy procedures.
The researchers pooled a young-onset age group, including patients between the ages of 18 and 49 years old, in whom 145,998 procedures were performed, including 79,934 procedures in patients aged 45-49 years. A comparator group with 336,627 procedures in patients aged 50-54 years was also included in the study. The findings were categorized into CRC, advanced premalignant lesions (APL), and “any neoplasia,” the latter of which included all adenomas, sessile serrated polyps, and CRC.
Among patients aged 18-44 years, the most frequent indications were “diagnostic-other” (45.6%) as well as “diagnostic-bleeding” (39.4%). Among patients between 45 and 49 years of age, the most frequent indications were “screening” (41.4%) and “diagnostic-other” (30.7%). Nearly all (90%) procedures among those aged 50-54 years were for screening.
A multivariable logistic regression identified 5 variables predictive of either APL or CRC in patients between 18 and 49 years of age: increasing age (odds ratio, 1.08; 95% confidence interval, 1.07-1.08; P <0.01), male sex (OR = 1.67; 95% CI, 1.63-1.70; P <0.01), White race (vs. African American: OR = 0.76; 95% CI, 0.73-0.79, P <0.01; vs. Asian: OR = 0.89; 95% CI, 0.84-0.94, P <0.01), family history of CRC (OR = 1.21; 95% CI, 1.16-1.26; P <0.01) and polyps (OR = 1.33; 95% CI, 1.24-1.43; P <0.01), and examinations for bleeding (OR = 1.15; 95% CI, 1.12-1.18; P <0.01) or screening (OR = 1.20; 95% CI, 1.16-1.24; P <0.01).
The prevalence of neoplastic findings in the young-onset age-group increased with increasing age for the categories of any neoplasia, APLs, and CRC. Among patients aged 40-44, 26.59% had any neoplasia, 5.76% had APL, and 0.53% had CRC. In those aged 45-49 years, around 32% had any neoplasia, approximately 7.5% had APLs, and nearly 0.58% had CRC. In the 50- to 54-year-old group, the prevalences of any neoplasia, APL, and CRC were 37.72%, 9.48%, and 0.32%, respectively.
Across all age groups, a family history of CRC was associated with a higher prevalence of any neoplasia and APL. In addition, the rates of any APL and neoplasia in patients with a family history of CRC were comparable to patients who were 5 years older but had no family history of the disease. Across most young-onset age group, individuals with a positive family history had a lower CRC prevalence versus patients with no family history.
The researchers noted that their population data are derived from ambulatory endoscopy centers, which may introduce bias associated with insurance coverage or patient preference to attend specific endoscopic centers. Additionally, the investigators stated that many records on race and ethnicity were missing, further limiting the findings.
“The present analysis of neoplastic colorectal pathology among individuals younger than age 50 suggests that lowering the screening age to 45 for men and women of all races and ethnicities will likely detect important pathology rather frequently,” they concluded. In addition, the researchers noted that the study results “underscore the importance of early messaging to patients and providers in the years leading up to age 45.” Ultimately, improved “awareness of pathology prevalence in individuals younger than age 45 can help guide clinicians in the clinical management of CRC risk,” the researchers wrote.
Several of the researchers reported conflicts of interest with Exact Sciences Corp and Freenome. The study received no industry funding.
FROM GASTROENTEROLOGY
Cryoballoon ablation demonstrates long-term durability in BE
Similar to radiofrequency ablation, cryoballoon ablation (CBA) is a durable approach that can eradicate Barrett’s esophagus (BE) in treatment-naive patients with dysplastic BE, according to a single-center cohort study.
Endoscopic mucosal resection (EMR), radiofrequency ablation (RFA), and cryotherapy are established techniques in the endoscopic eradication therapy of BE, wrote study authors led by Mohamad Dbouk, MD, of Johns Hopkins Medical Institutions in Baltimore, in Techniques and Innovations in Gastrointestinal Endoscopy. Unlike RFA, which uses heat to induce tissue necrosis and reepithelialization with normal tissue, cryotherapy applies extreme cold to treat BE. Although cryotherapy has been studied over the past decade as an alternative ablative modality, Dr. Dbouk and colleagues noted that long-term data on its durability and outcomes are lacking.
To gauge the durability of CBA for dysplastic BE, the researchers examined outcomes in 59 consecutive treatment-naive patients with BE and confirmed low-grade dysplasia (n = 22), high-grade dysplasia (n = 33), or intramucosal cancer (n = 4), all of whom underwent CBA for BE eradication. The single-center cohort had a mean age of 66.8 years and was 91.5% male. The mean BE length was 5 cm, although 23.7% of patients had BE ≥8 cm in length.
Following baseline confirmation of dysplastic BE in biopsy and/or EMR specimens, patients underwent CBA of the gastric cardia and all visible BE, performed with the cryoballoon focal ablation system using a focal or pear-shaped cryoballoon. The investigators performed surveillance esophagogastroduodenoscopy (EGD) to assess the response to CBA. Patients with high-grade dysplasia (HGD) underwent EGD and biopsy every 3 months for the first year after completing CBA, every 6 months for the second year, and annually thereafter. Patients whose baseline biopsies showed low-grade dysplasia (LGD) underwent EGD and biopsy every 6 months during the first year after CBA and annually thereafter. Retreatment with ablation was allowed if recurrent dysplasia or intestinal metaplasia was found.
The study’s primary endpoints were short-term efficacy, defined as the rates of complete eradication of dysplasia (CE-D) and of intestinal metaplasia (CE-IM) at 1-year follow-up, and durability, defined as the proportion of patients achieving CE-D and CE-IM within 18 months and maintaining them at 2- and 3-year follow-up.
The median follow-up was 54.3 months. Of the 56 patients evaluable at 1 year, approximately 95% achieved CE-D and 75% achieved CE-IM. When patients were stratified by baseline dysplasia grade, the CE-D rate was 96% in both the LGD and HGD groups. The median number of CBA sessions needed to achieve CE-IM by 1 year was 3.
Throughout treatment and follow-up, no patient progressed beyond the baseline dysplasia grade or developed esophageal cancer. All patients maintained CE-D at years 2, 3, and 4. In addition, 98% of patients had CE-IM at 2 years, 98% at 3 years, and 97% at 4 years. After stratification by baseline dysplasia grade, the researchers found no significant between-group differences in the rates of CE-D or CE-IM at any follow-up year.
Of the 48 patients who initially achieved CE-IM, 14.6% (n = 7) developed recurrent intestinal metaplasia (IM) after a median of 20.7 months: six recurrences in the esophagus and one at the gastroesophageal junction. Approximately 57% of patients with recurrent IM had LGD at baseline, while 43% had HGD. BE length was not significantly associated with the risk of IM recurrence on Cox proportional hazards analysis (hazard ratio, 1.02; 95% confidence interval, 0.86-1.2; P = .8).
Approximately 8.5% of patients developed post-CBA strictures requiring dilation during the study period. On bivariate analysis, patients with a BE length ≥8 cm were significantly more likely to develop strictures than those without ultra-long BE (28.6% vs. 2.2%; P = .009). Strictures occurred within the first 4 months after the initial CBA, with a median of 2 months from the first CBA treatment to stricture detection on follow-up EGD. Postprocedural bleeding requiring clip closure occurred in 1.7% of patients, who were taking clopidogrel for atrial fibrillation during the first year of active treatment.
Limitations of the study included the small sample size as well as the inclusion of patients from a single center, which the researchers suggest may limit the generalizability of the results.
“More research is needed to confirm the long-term durability of CBA,” the authors concluded. “Randomized controlled trials comparing CBA with RFA are needed to assess the role of CBA as a first-line and rescue EET.”
Several of the researchers reported conflicts of interest with industry. The study received no industry funding.
Barrett’s endoscopic eradication therapy, which combines resection of visible lesions with ablation of the remaining Barrett’s mucosa, is the standard of care for dysplasia management. Radiofrequency ablation (RFA) is 91% successful in eliminating dysplasia and 78% successful in eliminating intestinal metaplasia (IM). Recurrence of dysplasia is rare, although IM recurs in about 20% of patients.
This study by Dbouk et al. examines the success of a newer ablation modality, the cryoballoon focal ablation system (CbFAS), in ablating Barrett’s tissue. With CbFAS, the mucosa is focally ablated by freezing on contact with a nitrous oxide–cooled balloon. In this single-center, single-operator study, CbFAS eliminated dysplasia and IM for up to 4 years at rates comparable to RFA, with dysplasia and IM recurring in 1.9% and 14.6% of patients, respectively. The stricture rate of 8.5% was higher than the 5% typically reported for RFA.
Given the impressive results of RFA, one might ask why alternative ablation therapies are needed. CbFAS equipment costs are lower than those of RFA, and postprocedural discomfort may be less. Failure of ablation is poorly understood; it is likely attributable to inadequate reflux suppression and perhaps to thicker areas of Barrett’s mucosa. The greater depth of injury with cryoablation may succeed in some cases of RFA failure. The complexity of the procedure remains high, and excessive overlap of treatment sites probably explains the higher stricture rate. Where cryoballoon ablation fits in the Barrett’s ablation paradigm is not yet clear, but its lower cost and availability may give this new technology traction in the established field of Barrett’s ablation.
Bruce D. Greenwald, MD, is a professor of medicine at the University of Maryland, Baltimore, and the Marlene and Stewart Greenebaum Comprehensive Cancer Center, Baltimore. He is a consultant for Steris Endoscopy.
FROM TECHNIQUES AND INNOVATIONS IN GASTROINTESTINAL ENDOSCOPY
Improved follow-up needed to find late-stage pancreatic cancers
A relatively large number of late-stage pancreatic ductal adenocarcinomas (PDACs) are detected during follow-up surveillance, yet no single patient- or protocol-specific factor appears to be significantly associated with detecting late-stage disease during this period, according to a new systematic literature review and meta-analysis.
The researchers, led by Ankit Chhoda, MD, of Yale University, New Haven, Conn., wrote in Gastroenterology that interval progression in high-risk individuals “highlights the need for improved follow-up methodology with higher accuracy to detect prognostically significant and treatable lesions.”
Individuals at high risk for PDAC are encouraged to undergo routine surveillance because early detection and resection of T1N0M0 PDAC and high-grade precursor lesions may improve survival. According to Dr. Chhoda and colleagues, the challenge of interval progression of cancers during surveillance for gastrointestinal malignancies has been well described in both general and at-risk populations. Previous studies, however, have not scrutinized the issues associated with late-stage PDACs detected during follow-up surveillance.
“Late-stage PDACs necessitate critical appraisal of current follow-up strategies to detect successful targets and perform timely resections,” the authors wrote. The researchers added that the diagnosis of late-stage PDACs during follow-up emphasizes the need for implementing “quality measures to avoid preventable causes, including surveillance adherence and diagnostic errors.”
To understand the incidence of late-stage PDACs during follow-up in high-risk individuals, Dr. Chhoda and colleagues performed a systematic literature review and meta-analysis of data on follow-up strategies for early PDAC detection in high-risk populations.
Outcomes of interest for the analysis included the overall diagnosis of advanced neoplasia as well as surveillance-detected/interval late-stage PDACs (T2–4N0M0/metastatic stage PDAC) during follow-up. The investigators defined surveillance-detected late-stage PDACs as those detected during surveillance and interval late-stage PDACs as those presenting symptomatically between visits.
The researchers also performed a metaregression of late-stage PDAC incidence rates to examine their relationship with clinicoradiologic features in high-risk individuals.
A total of 13 studies of surveillance in 2,169 high-risk individuals were included in the systematic review, 12 of which were included in the meta-analysis. Across studies, high-risk individuals were followed for a total of 7,302.72 patient-years to detect incident lesions or progression of preexisting pancreatic abnormalities.
Among all high-risk individuals who underwent follow-up, the investigators identified 53 cases of advanced neoplasia: 7 high-grade pancreatic intraepithelial neoplasms, 7 high-grade intraductal papillary mucinous neoplasms, and 39 PDACs. According to the meta-analysis, the cumulative incidence of advanced neoplasia was 3.3 per 1,000 patient-years (95% confidence interval, 0.6-7.4; P < .001), and the cumulative incidence of surveillance-detected/interval late-stage PDACs during follow-up was 1.7 per 1,000 patient-years (95% CI, 0.2-4.0; P = .03).
In a separate analysis, the investigators examined the relationship between follow-up imaging modality and late-stage PDAC incidence. Four studies relied mostly on cross-sectional imaging, such as computed tomography or magnetic resonance imaging with cholangiopancreatography, while eight used endoscopic ultrasound (EUS) together with cross-sectional modalities.
The investigators found no significant associations between late-stage PDACs and surveillance imaging modality, baseline pancreatic morphology, study location, genetic background, gender, or age. The incidence of late-stage PDACs in studies using mostly cross-sectional imaging was 0.7 per 1,000 patient-years (95% CI, 0.0-8.0), lower than the 2.5 per 1,000 patient-years (95% CI, 0.6-5.4) reported with EUS plus cross-sectional modalities, although the difference was not statistically significant (P = .2).
The incidence of late-stage PDACs during follow-up also did not differ significantly between high-risk individuals with baseline pancreatic abnormalities (0.0 per 1,000 patient-years; 95% CI, 0.0-0.3) and those with a normal baseline (0.9 per 1,000 patient-years; 95% CI, 0.0-2.8; P = .9).
Most studies included in the analysis did not report on diagnostic errors or surveillance adherence, the researchers wrote. Nonadherence to surveillance and delays in surveillance accounted for four late-stage PDACs, and surveillance cessation and/or delays were reported in 4 of 19 high-risk individuals. Information on symptoms, timing of presentation, lesion site, and surveillance adherence was limited, which the investigators said prevented a formal meta-analysis of these factors.
In their summary, the study authors noted that in clinical practice there is a need for improved quality measures and adherence to surveillance programs to reduce the risk of diagnostic errors. The authors stated that evidence on the impact of these quality measures “on surveillance outcomes will not only improve quality of surveillance practices, but also enrich our communication with patients who undergo surveillance.”
The researchers reported no conflicts of interest with the pharmaceutical industry, and the study did not receive any funding.
Surveillance of individuals at increased risk of pancreatic ductal adenocarcinoma (PDAC) offers an opportunity to improve disease mortality through detection of premalignant lesions and earlier stage PDAC. Emerging data suggest that outcomes in surveillance-detected PDAC are superior to those diagnosed after onset of signs and symptoms. This study by Chhoda et al. highlights a potential quality gap in current surveillance programs, namely the diagnosis of interval cancers and late-stage metastatic PDAC.
The investigators report a cumulative incidence of late-stage PDAC of 1.7 per 1,000 patient-years in surveillance, while the incidence of any advanced neoplasia (high-grade PanIN, high-grade IPMN, neuroendocrine tumor > 1 cm, and any-stage PDAC) was 3.3 per 1,000 patient-years. Importantly, late-stage PDAC was defined as T2-4N0-1M0-1 in this study. This builds on the 2013 International Cancer of the Pancreas Screening definition of the “success of a screening program” as treatment of T1N0M0 PDAC, later updated to include any resected PDAC confined to the pancreas. The cumulative incidence of resectable lesions in surveillance was 2.2 per 1,000 patient-years, while the incidence of unresectable PDAC was 0.6 per 1,000 patient-years. Unfortunately, clinical features could not predict the onset of these 11 unresectable PDACs.
Given data-reporting limitations, it is uncertain how many advanced PDACs resulted from delayed surveillance, diagnostic errors, or other preventable factors. Addressing these contributing factors, as well as identifying clinical indicators that may improve the efficacy of existing regimens (such as new-onset diabetes, worsening glycemic control in a person with diabetes, weight loss, and incorporation of novel biomarkers), will be critical to optimizing PDAC surveillance outcomes in high-risk individuals.
Aimee Lucas, MD, is associate professor of medicine in the division of gastroenterology, Icahn School of Medicine at Mount Sinai, New York. She reports receiving research support from and consulting for Immunovia, which has developed a blood-based biomarker for early PDAC detection.
Surveillance of individuals at increased risk of pancreatic ductal adenocarcinoma (PDAC) offers an opportunity to improve disease mortality through detection of premalignant lesions and earlier stage PDAC. Emerging data suggest that outcomes in surveillance-detected PDAC are superior to those diagnosed after onset of signs and symptoms. This study by Chhoda et al. highlights a potential quality gap in current surveillance programs, namely the diagnosis of interval cancers and late-stage metastatic PDAC.
Investigators report a cumulative incidence of late-stage PDAC of 1.7 per 1,000 patient-years in surveillance, while the incidence of any advanced neoplasia (high-grade PanIN, high-grade IPMN, NET > 1 cm and any stage PDAC) was 3.3 per 1,000 patient-years. Importantly, late-stage PDAC was defined as T2-4N0-1M0-1 in this study. This is based on the 2013 International Cancer of the Pancreas Screening definition of “success of a screening program” as treatment of T1N0M0 PDAC, which was later updated to include a resected PDAC confined to the pancreas. The cumulative incidence of resectable lesions was 2.2 per 1,000 patient-years, while the incidence of unresectable PDAC was 0.6 per 1,000 patient-years in surveillance. Unfortunately, clinical features were unable to predict the onset of these 11 unresectable PDACs.
Given data reporting limitations, it is uncertain how many advanced PDACs were a result of delayed surveillance, diagnostic errors, or other preventable factors. Addressing these contributing factors as well identifying clinical indicators that may improve the efficacy of existing regimens (such as new onset diabetes, worsening in glycemic control in a person with diabetes, weight loss, and incorporation of novel biomarkers) will be critical to optimizing PDAC surveillance outcomes in in high-risk individuals.
Aimee Lucas, MD, is associate professor of medicine, division of gastroenterology, Icahn School of Medicine at Mount Sinai, New York. She reports receiving research support and consulting from Immunovia, who developed a blood-based biomarker for early PDAC detection.
Surveillance of individuals at increased risk of pancreatic ductal adenocarcinoma (PDAC) offers an opportunity to improve disease mortality through detection of premalignant lesions and earlier stage PDAC. Emerging data suggest that outcomes in surveillance-detected PDAC are superior to those diagnosed after onset of signs and symptoms. This study by Chhoda et al. highlights a potential quality gap in current surveillance programs, namely the diagnosis of interval cancers and late-stage metastatic PDAC.
Investigators report a cumulative incidence of late-stage PDAC of 1.7 per 1,000 patient-years in surveillance, while the incidence of any advanced neoplasia (high-grade PanIN, high-grade IPMN, NET > 1 cm and any stage PDAC) was 3.3 per 1,000 patient-years. Importantly, late-stage PDAC was defined as T2-4N0-1M0-1 in this study. This is based on the 2013 International Cancer of the Pancreas Screening definition of “success of a screening program” as treatment of T1N0M0 PDAC, which was later updated to include a resected PDAC confined to the pancreas. The cumulative incidence of resectable lesions was 2.2 per 1,000 patient-years, while the incidence of unresectable PDAC was 0.6 per 1,000 patient-years in surveillance. Unfortunately, clinical features were unable to predict the onset of these 11 unresectable PDACs.
Given data reporting limitations, it is uncertain how many advanced PDACs were a result of delayed surveillance, diagnostic errors, or other preventable factors. Addressing these contributing factors as well identifying clinical indicators that may improve the efficacy of existing regimens (such as new onset diabetes, worsening in glycemic control in a person with diabetes, weight loss, and incorporation of novel biomarkers) will be critical to optimizing PDAC surveillance outcomes in in high-risk individuals.
Aimee Lucas, MD, is associate professor of medicine, division of gastroenterology, Icahn School of Medicine at Mount Sinai, New York. She reports receiving research support and consulting from Immunovia, who developed a blood-based biomarker for early PDAC detection.
A relatively large number of late-stage pancreatic ductal adenocarcinomas (PDACs) are detected during follow-up surveillance, yet no single patient- or protocol-specific factor appears to be significantly associated with detecting late-stage disease during this period, according to a new systematic literature review and meta-analysis.
The researchers, led by Ankit Chhoda, MD, of Yale University, New Haven, Conn., wrote in Gastroenterology that interval progression in high-risk individuals “highlights the need for improved follow-up methodology with higher accuracy to detect prognostically significant and treatable lesions.”
Individuals at high risk for PDAC are encouraged to undergo routine surveillance for the disease because early detection and resection of T1N0M0 PDAC and high-grade precursors may improve survival outcomes. According to Dr. Chhoda and colleagues, challenges of interval progression of cancers during the surveillance period for gastrointestinal malignancies have been well described in the general and at-risk patient populations. Previous studies, the authors explained, have not scrutinized the issues associated with late-stage PDACs detected during follow-up surveillance.
“Late-stage PDACs necessitate critical appraisal of current follow-up strategies to detect successful targets and perform timely resections,” the authors wrote. The researchers added that the diagnosis of late-stage PDACs during follow-up emphasizes the need for implementing “quality measures to avoid preventable causes, including surveillance adherence and diagnostic errors.”
To understand the incidence rates of late-stage PDACs during follow-up in high-risk individuals, Dr. Chhoda and researchers performed a systematic literature review and meta-analysis of data that included follow-up strategies for early PDAC detection among a high-risk population.
Outcomes of interest for the analysis included the overall diagnosis of advanced neoplasia as well as surveillance-detected/interval late-stage PDACs (T2–4N0M0/metastatic stage PDAC) during follow-up. The investigators defined surveillance-detected and interval late-stage PDACs as late-stage PDACs that were detected during surveillance and as those presenting symptomatically between visits, respectively.
The researchers also performed metaregression of the incidence rates of late-stage PDACs to examine the relationship with clinicoradiologic features in high-risk individuals.
A total of 13 studies on surveillance in 2,169 high-risk individuals were included in the systematic review, while 12 studies were included in the meta-analysis. Across studies, high-risk individuals were followed for over 7,302.72 patient-years for the purposes of detecting incident lesions or progression of preexisting pancreatic abnormalities.
In all high-risk individuals who underwent follow-up, the investigators identified a total yield of advanced neoplasia of 53. This total yield consisted of 7 high-grade pancreatic intraepithelial neoplasms, 7 high-grade intraductal papillary mucinous neoplasms, and 39 PDACs. According to the meta-analysis, the cumulative incidence of advanced neoplasia was 3.3 (95% confidence interval, 0.6-7.4; P < .001) per 1,000 patient-years. During follow-up, the cumulative incidence of surveillance-detected/interval late-stage PDACs was 1.7 per 1,000 patient-years (95% CI, 0.2-4.0; P = .03).
In a separate analysis, the investigators examined the relationship between follow-up imaging modality and late-stage PDAC incidence. Studies relied either mostly on cross-sectional imaging, such as computed tomography or magnetic resonance imaging with cholangiopancreatography (n = 4), or on endoscopic ultrasound (EUS) combined with cross-sectional modalities (n = 8).
The investigators found no significant associations between late-stage PDACs and surveillance imaging modality, baseline pancreatic morphology, study location, genetic background, gender, or age. The incidence of late-stage PDACs in studies relying mostly on cross-sectional imaging was 0.7 per 1,000 patient-years (95% CI, 0.0-8.0), lower than the 2.5 per 1,000 patient-years (95% CI, 0.6-5.4) reported with EUS plus cross-sectional modalities, but the difference was not statistically significant (P = .2).
There was also no significant difference in the incidence of late-stage PDACs during follow-up between high-risk individuals with baseline pancreatic abnormalities (0.0 per 1,000 patient-years; 95% CI, 0.0-0.3) and those with a normal baseline (0.9 per 1,000 patient-years; 95% CI, 0.0-2.8; P = .9).
Most studies included in the analysis did not report on diagnostic errors or surveillance adherence, the researchers wrote. Nonadherence to surveillance and delays in surveillance accounted for 4 late-stage PDACs, and surveillance cessation and/or delays were reported in 4 of 19 high-risk individuals. Information on symptoms, timing of presentation, lesion site, and surveillance adherence was too limited to permit a formal meta-analysis, the investigators noted.
In their summary, the study authors noted that in clinical practice there is a need for improved quality measures and adherence to surveillance programs to reduce the risk of diagnostic errors. The authors stated that evidence on the impact of these quality measures “on surveillance outcomes will not only improve quality of surveillance practices, but also enrich our communication with patients who undergo surveillance.”
The researchers reported no conflicts of interest with the pharmaceutical industry, and the study did not receive any funding.
FROM GASTROENTEROLOGY