Vedolizumab Beats Infliximab as Second-Line Therapy for Ulcerative Colitis


Patients with ulcerative colitis (UC) who fail first-line therapy appear to have better outcomes with vedolizumab (Entyvio) than with infliximab, suggests EFFICACI, the first trial to directly compare second-line advanced therapies in this disease.

Vedolizumab was superior to infliximab in achieving steroid-free clinical remission at week 14 in patients who had failed a first-line subcutaneous anti–tumor necrosis factor (anti-TNF) therapy, said study presenter Guillaume Bouguen, MD, PhD, of the gastroenterology department, CHU Rennes – Pontchaillou Hospital, France.

The drug also outperformed infliximab in the induction of endoscopic improvement, and its safety outcomes were “consistent with the known profile of both drugs in previous trials,” Bouguen said.

The research was presented at the European Crohn’s and Colitis Organisation 2025 Congress.

 


The study reports only short-term outcomes, so it “remains unclear whether vedolizumab’s advantage is sustained over time or whether infliximab may catch up in effectiveness,” Tauseef Ali, MD, AGAF, executive medical director, SSM Health St. Anthony Digestive Care, Crohn’s and Colitis Center, Oklahoma City, said in an interview.

Bouguen noted that the trial was unblinded at week 14 and that patients were followed up to week 54, data for which will be presented in the near future.

 

Head-to-Head Trial

Treating ulcerative colitis beyond the first line of therapy is “becoming challenging” because there are several therapeutic classes and drugs to choose from but no strong evidence to support physician decision-making, Bouguen said.

No head-to-head trials of second-line advanced therapies for UC had been performed, he said. So Bouguen and colleagues conducted a randomized, double-blind trial to determine whether vedolizumab, an integrin receptor antagonist, is superior to infliximab, a TNF antagonist, in patients with ulcerative colitis who had failed a first-line subcutaneous TNF antagonist.

They enrolled patients from 24 centers across France who had moderate to severe disease, defined by a total Mayo score ≥ 6, despite at least 12 weeks of treatment with the TNF antagonists golimumab (Simponi) or adalimumab (Humira and others).

Participants were randomly assigned to intravenous vedolizumab 300 mg or infliximab 5 mg/kg. Clinical and biological assessments were performed at baseline and at weeks 2 and 6. The primary endpoint was steroid-free clinical remission (Mayo score ≤ 2) at week 14.

Of 165 patients assessed for eligibility, 78 were randomly assigned to vedolizumab and 73 to infliximab; 77 and 70 patients, respectively, were available for assessment at week 14. Approximately 40% of the participants were women, and the average age was almost 40 years.

The mean total Mayo score at baseline was comparable between the two groups (9.0 vedolizumab; 8.7 infliximab). The majority in both groups had previously been treated with adalimumab, and almost 60% had experienced a loss of response to therapy.

Steroid-free clinical remission at week 14 was achieved by 34.6% of patients treated with vedolizumab vs 19.2% of those given infliximab (P = .033).
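As a rough illustration of where a P value like this comes from, the sketch below runs a standard two-proportion z-test on the reported remission rates, assuming intention-to-treat denominators of 78 and 73 (the randomized numbers); the trial’s actual prespecified analysis was not detailed in the presentation.

```python
# Minimal sketch, not the trial's prespecified analysis: a pooled
# two-proportion z-test on the week-14 primary endpoint, assuming
# intention-to-treat denominators of 78 (vedolizumab) and 73 (infliximab).
from statsmodels.stats.proportion import proportions_ztest

remitters = [27, 14]      # ~34.6% of 78 and ~19.2% of 73
randomized = [78, 73]

z_stat, p_value = proportions_ztest(remitters, randomized)
print(f"z = {z_stat:.2f}, two-sided P = {p_value:.3f}")   # roughly in line with the reported P = .033
```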

Endoscopic remission at week 14 was achieved by 19.5% of patients in the vedolizumab group vs 8.3% of those treated with infliximab (P = .0507), while endoscopic improvement was seen in 46.8% and 29.2% of patients, respectively (P = .0273).

There were no statistically significant differences between the two treatment groups in rates of clinical response or mean C-reactive protein (CRP) levels between baseline and week 14, and there was no significant difference in fecal calprotectin levels at week 14.

Interestingly, Bouguen said, none of the parameters examined, including age, sex, Mayo score, CRP levels, and concomitant immunosuppressant use, was a significant predictor of clinical remission.

The overall incidence of adverse events, including respiratory tract and Clostridioides difficile infections, was comparable between the vedolizumab and infliximab groups, although patients receiving infliximab had higher rates of disease worsening and infusion reactions.

 

Questions Remain

Study coinvestigator Matthieu Allez, MD, PhD, head of the gastroenterology department, Hôpital Saint-Louis, Assistance Publique-Hôpitaux de Paris, said in an interview that he was surprised by the findings.

“I think infliximab is a much better drug than vedolizumab,” considering the rate of immunosuppressant combination therapy that is administered in ulcerative colitis, said Allez, who was the session’s co-chair.

This is a “key aspect” as “you can give more” of such therapy to patients receiving infliximab, “but, in fact, it seems like they do better” with vedolizumab, Allez said.

Ali said that the trial “addresses a critical gap in the treatment of ulcerative colitis: Whether switching within the anti-TNF class or swapping to vedolizumab is more effective after failure of a first subcutaneous anti-TNF.”

“This question has real-world clinical relevance, as gastroenterologists often face this decision,” he added.

Ali, who was not involved in the study, said that even though the results “suggest that vedolizumab may be a more effective option than infliximab in this patient population” and there were no major safety concerns with either drug, “one must exercise caution in interpreting and applying the results to clinical practice.”

Moreover, the lack of a statistically significant difference in clinical response rates between the drugs “raises questions about whether the primary endpoint difference is clinically meaningful over the long term,” he said.

The study was conducted in only one country, thus potentially limiting its generalizability, Ali noted, and it included only patients who had failed on subcutaneous, not intravenous, anti-TNF therapy. There was also a lack of biomarker stratification, “making it unclear which patients would benefit most from switching vs swapping strategies,” he added.

“While vedolizumab may be preferable, many other factors,” such as drug serum levels, immunogenicity, urgency of response, access, and cost, “should guide decision-making,” Ali said.

The study was funded by the French national research program, with additional funding from Takeda. Bouguen declared relationships with AbbVie, Janssen, Lilly, Takeda, Celltrion, Sandoz, Galapagos, Tillotts, and Amgen. No other disclosures were reported.

A version of this article appeared on Medscape.com.

Anxiety, Depression, and Insufficient Exercise Linked to IBD Flare


Psychosocial factors, such as anxiety and depression, are associated with an increased risk for both self-reported (“clinical”) and objectively confirmed (“hard”) flares in inflammatory bowel disease (IBD), suggested a study of UK patients.

The research was presented at the European Crohn’s and Colitis Organisation (ECCO) 2025 Congress.

“Despite clinical remission, there is a significant burden of psychosocial comorbidity in IBD patients,” said study presenter Lauranne A.A.P. Derikx, PhD, a gastroenterology researcher at Erasmus University MC, Rotterdam, the Netherlands.

“Anxiety, sleep, and somatization were associated with an increased risk of clinical flare, and depression and lack of exercise were associated with an increased risk of hard flare,” she said. “Altogether, this supports a holistic approach in IBD patients.”

 


Stephen E. Lupe, PsyD, director of behavioral medicine for the department of gastroenterology, hepatology and nutrition at the Cleveland Clinic, Ohio, who was not involved in the study, agreed.

“Whole-person care is so important” in IBD, and this study is part of a growing literature making the connection between symptom flare and factors such as anxiety, depression, stress, and even trauma, he said in an interview.

 

Searching for Predictive Links

The relapsing and remitting disease course in IBD is dynamic and hard to predict, Derikx said. Unfortunately, clinicians don’t know which patients with IBD will develop a flare or when it will occur.

There’s a high prevalence of psychosocial comorbidity among patients with IBD and a “bidirectional relationship between psychosocial vulnerabilities” and the disease course via the gut-brain axis, Derikx noted.

To determine which psychosocial factors may be associated with and predictive of IBD flare, researchers analyzed data from the PREdiCCt study, a large prospective study of patients with IBD from 47 centers across the United Kingdom that aims to determine the factors associated with developing a flare.

The median age of PREdiCCt study participants was 44 years, the median duration of IBD was 10 years, and 35% were receiving advanced IBD therapy. The median fecal calprotectin level was 49 mcg/g, although 18% of patients had a level > 250 mcg/g, Derikx noted.

To be included in PREdiCCt, patients must have received a diagnosis of IBD more than 6 months previously, have had no change in their medication for more than 2 months, and have answered “yes” to the question: Do you think your disease has been well controlled in the past 1 month? This question was chosen as a measure of clinical remission.

The team collected stool samples and gathered information via questionnaires about lifestyle, diet, and other factors.

 

Depression and Anxiety Increase Risk

Researchers included 1641 patients — 830 with Crohn’s and 811 with ulcerative colitis or IBD unclassified (IBDU) — with complete datasets in their analysis of associations between psychosocial factors and IBD flare.

Baseline questionnaires identified moderate anxiety in 18.8% of participants, severe anxiety in 16.1%, moderate depression in 9.8%, severe depression in 5.7%, sleep disturbances in 46.4%, moderate somatization in 22.8%, severe somatization in 7.9%, insufficient exercise in 22.2%, and consumption of more than 14 units of alcohol in 24%.

After 24 months of follow-up, 36% of patients had experienced a clinical flare, defined as answering “no” to the question: Do you think your disease has been well controlled in the past 1 month/since you last logged in to the [study] portal?

In addition, 13% of patients experienced a hard flare, defined as a clinical flare plus C-reactive protein levels > 5 mg/L and/or a calprotectin level > 250 mcg/g and a change in IBD therapy.
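Expressed as code, the two flare definitions look like the hypothetical helper below; the thresholds come from the study, but the function and field names are illustrative only.

```python
# Hypothetical helper mirroring the study's flare definitions.
# Clinical flare: patient answers "no" to the disease-control question.
# Hard flare: clinical flare plus CRP > 5 mg/L and/or fecal calprotectin
# > 250 mcg/g, together with a change in IBD therapy.

def classify_flare(well_controlled: bool, crp_mg_l: float,
                   calprotectin_mcg_g: float, therapy_changed: bool) -> str:
    if well_controlled:
        return "no flare"
    biomarker_positive = crp_mg_l > 5 or calprotectin_mcg_g > 250
    return "hard flare" if biomarker_positive and therapy_changed else "clinical flare"

print(classify_flare(False, 12.0, 310.0, True))    # hard flare
print(classify_flare(False, 2.0, 80.0, False))     # clinical flare
```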

Survival analyses with Cox frailty models adjusted for baseline fecal calprotectin, sex, index of multiple deprivation, hospital site, and patient age revealed statistically significant associations between several psychosocial factors and increased risk for flare.
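For readers who want a concrete picture of this step, here is a minimal Python sketch in the same spirit. Because the lifelines library does not fit frailty (random-effect) terms directly, the sketch approximates the site-level frailty with cluster-robust standard errors; the file and column names are hypothetical.

```python
# Rough sketch of an adjusted time-to-flare analysis, assuming a hypothetical
# per-patient table with numeric covariates. The study used Cox frailty models;
# here the hospital-site frailty is approximated with cluster-robust standard
# errors, which lifelines does support.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("predicct_followup.csv")  # hypothetical export

cols = ["months_to_flare", "clinical_flare",        # follow-up time and event flag
        "severe_anxiety", "baseline_calprotectin",  # exposure and adjustment variables
        "female", "deprivation_index", "age",
        "hospital_site"]                            # clustering variable

cph = CoxPHFitter()
cph.fit(df[cols],
        duration_col="months_to_flare",
        event_col="clinical_flare",
        cluster_col="hospital_site",  # robust SEs by site, standing in for a frailty term
        robust=True)
cph.print_summary()  # adjusted hazard ratios, comparable in spirit to the reported aHRs
```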

Moderate anxiety in Crohn’s disease increased clinical flare risk (adjusted hazard ratio [aHR], 1.64), as did severe anxiety in both Crohn’s disease (aHR, 1.86) and ulcerative colitis/IBDU (aHR, 1.46). Moderate depression and severe depression increased the flare risk in ulcerative colitis/IBDU (aHR, 1.72 and 1.67, respectively). Poor sleep quality in Crohn’s disease (aHR, 1.58) and severe somatization in Crohn’s disease (aHR, 3.86) and ulcerative colitis/IBDU (aHR, 1.96) also increased clinical flare risk.

Fewer psychosocial factors were associated with increased risk for hard flare: moderate depression in ulcerative colitis/IBDU (aHR, 2.5), severe somatization in Crohn’s disease (aHR, 2.34), and lack of exercise in ulcerative colitis/IBDU (aHR, 1.55).

 

Physician-Patient Disconnect

There is “very little correlation” between self-reported and symptomatic flare in IBD, Lupe said. “This happens all the time, where the gastroenterologist will come out of the endoscopy suite and go: ‘You’re in remission.’ And the patient goes: ‘What are you talking about? I’m still going to the bathroom 20 times a day.’ ”

Now there are data showing that, if the care team undertakes behavioral work with patients who have IBD, “the medications work more effectively,” Lupe said.

“I think medicine is in a point of transition right now,” he added. “We’re (moving from) looking at people as disease states and ‘how do I treat the disease’ to ‘how do I take care of this human being,’ knowing that everything this human being does, including everything we put in our mouth, everything we experience, changes what happens inside our body, and it’s measurable.”

The PREdiCCt study is sponsored by the University of Edinburgh, Scotland. Derikx declared relationships with AbbVie, Janssen Pharmaceuticals, Sandoz, Galapagos, and Pfizer. Other authors also declared relationships with pharmaceutical companies.

A version of this article appeared on Medscape.com.


Machine-Learning Model Identifies Gut Biomarkers That May Help Diagnose IBD Patients


Gut microbial biomarkers identified using machine learning can differentiate patients with inflammatory bowel disease (IBD) from healthy control individuals, according to a study presented at the European Crohn’s and Colitis Organisation (ECCO) 2025 Congress.

Of the four techniques the researchers tested, a “machine-learning approach achieves the highest diagnostic accuracy, effectively distinguishing IBD and particularly differentiating Crohn’s disease from healthy controls in independent cohorts,” said study presenter Jee-Won Choi, department of biology, Kyung Hee University, Seoul, Republic of Korea.

“Integrating microbial markers with conventional diagnostics could enhance [their] clinical utility,” Choi said. However, further research is needed to determine the long-term validity of the biomarkers.

Some experts questioned the reliability of the markers for IBD diagnosis because of the makeup of the study populations, which included patients with known IBD who likely had undergone treatment that may have altered their gut microbiomes.

 

Biomarkers Found and Tested

The gut microbiota exists in two states: Eubiosis, which supports health, and inflammatory dysbiosis, an imbalanced state associated with disease, most notably IBD, Choi noted.

Although many studies have explored the differences between these two states, there have been three major challenges in identifying IBD biomarkers: The studies have had small sample sizes, they’ve concentrated on a single analytical approach, and they’ve had low reproducibility.

To overcome those challenges, the researchers analyzed a large-scale dataset and applied multiple methods to determine which analytical approach yielded the most reliable results, Choi said. They validated their results in three independent cohorts with diverse populations.

The study included 414 patients with Crohn’s disease, 880 with ulcerative colitis, and 2467 healthy control individuals from 21 centers in the Republic of Korea. Their gut microbiota profiles were analyzed from stool samples using 16S ribosomal RNA gene sequencing.

Researchers used four techniques to identify potential IBD biomarkers in the samples: differential abundance analysis, supervised random forest machine learning, unsupervised network analysis, and literature-based curation.

Biomarker candidates generated by these methods were then compared for their diagnostic ability using a machine learning model. The findings were tested in three independent cohorts — one domestic and one international population, both of which included patients with IBD and healthy control individuals, and one dataset of patients without IBD.
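As a concrete (and simplified) illustration of the supervised step, the sketch below trains a random forest on per-sample microbial relative abundances and scores it by AUC on an independent cohort; the file and column names are hypothetical, and the study’s actual pipeline was not published with the abstract.

```python
# Simplified sketch of the supervised machine-learning step: a random forest
# trained on 16S-derived relative abundances, scored by AUC on an independent
# validation cohort. File and column names are hypothetical.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score

train = pd.read_csv("discovery_cohort_abundances.csv")       # rows = stool samples
test = pd.read_csv("validation_cohort_abundances.csv")

taxa = [c for c in train.columns if c.startswith("genus_")]  # abundance features

model = RandomForestClassifier(n_estimators=500, random_state=0)
model.fit(train[taxa], train["is_ibd"])                      # 1 = IBD, 0 = healthy control

probs = model.predict_proba(test[taxa])[:, 1]
print("validation AUC:", round(roc_auc_score(test["is_ibd"], probs), 3))

# Highest-weighted taxa serve as candidate biomarkers.
top = pd.Series(model.feature_importances_, index=taxa).nlargest(10)
print(top)
```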

The results showed that there were distinct differences in the microbial composition between healthy control individuals and patients with Crohn’s disease and with ulcerative colitis. Patients with IBD, particularly those with Crohn’s disease, consistently had a significantly higher prevalence of dysbiosis, Choi said.

Each of the four analytical techniques revealed distinct microbial biomarkers associated with IBD in general, as well as with Crohn’s disease and ulcerative colitis individually.

When comparing IBD patients overall with healthy control individuals, supervised machine learning resulted in the most effective biomarker sets for distinguishing between groups, with the area under the receiver operating characteristics curve (AUC) reaching 0.971. By comparison, the AUC results were 0.94 for literature-based curation, 0.924 for differential abundance analyses, and 0.914 for unsupervised network analysis.

Supervised machine learning also outperformed the other techniques when distinguishing between healthy control individuals and patients with ulcerative colitis (AUC, 0.958), and between patients with ulcerative colitis and those with Crohn’s disease (AUC, 0.902).

All the techniques performed strongly when distinguishing between healthy control individuals and patients with Crohn’s disease, with AUCs ranging from 0.911 to 0.95.

When the researchers turned to the independent datasets, they found that the biomarkers were able to distinguish between healthy control individuals and patients with IBD in general and particularly between healthy control individuals and those with Crohn’s disease, with AUCs of 0.969 in the domestic cohort and 0.848 in the international cohort.

The non-IBD cohort also demonstrated that the biomarkers were able to differentiate patients with metabolic dysfunction–associated steatotic liver disease, colorectal cancer, rheumatoid arthritis, and irritable bowel syndrome from those with ulcerative colitis and Crohn’s disease with a high degree of accuracy (AUCs ranging from 0.97 to 0.999).

 

Diagnostic Utility Questioned

Speaking from the audience, James Lindsay, PhD, professor of inflammatory bowel disease, Barts and The London School of Medicine and Dentistry, England, questioned the utility of the findings.

“Obviously, all these patients had IBD, and so they will have had treatment with antibiotics, etc,” he said. “Surely the right validation cohort would be a group of people who have not yet been diagnosed with IBD to see whether your biomarker is able to separate those because the reason that people with IBD will have a difference is all the reasons that you have explained, ie, these patients were on treatment at the time that you took the samples.”

As a result, the biomarker panel isn’t for diagnosis but to confirm known disease, he added.

It’s important to look for microbiome signals of IBD, session co-chair Lissy de Ridder, MD, PhD, associate professor of pediatric gastroenterology, Erasmus MC Sophia Children’s Hospital, Rotterdam, the Netherlands, said in an interview.

De Ridder agreed that the biomarkers need to be validated in patients who aren’t on treatments that could affect their gut microbiomes. Not only do medications for IBD make a big difference, but so do other drugs, such as proton-pump inhibitors and antibiotics, as well as dietary interventions.

“Having said that, because it’s a large population, that’s always a good start to take lessons from and then go more into the details” in further analyses, de Ridder added.

This research was funded by a grant from the Korea Health Technology R&D Project through the Korea Health Industry Development Institute, funded by the Ministry of Health & Welfare, Republic of Korea. No relevant financial relationships were declared.

A version of this article appeared on Medscape.com.

Publications
Topics
Sections

Gut microbial biomarkers identified using machine learning can differentiate patients with inflammatory bowel disease (IBD) from healthy control individuals, according to a study presented at the European Crohn’s and Colitis Organisation (ECCO) 2025 Congress.

Of the four techniques the researchers tested, a “machine-learning approach achieves the highest diagnostic accuracy, effectively distinguishing IBD and particularly differentiating Crohn’s disease from healthy controls in independent cohorts,” said study presenter Jee-Won Choi, department of biology, Kyung Hee University, Seoul, Republic of Korea.

“Integrating microbial markers with conventional diagnostics could enhance [their] clinical utility,” Choi said. However, further research is needed to determine the long-term validity of the biomarkers.

Some experts questioned the reliability of the markers for IBD diagnosis because of the makeup of study populations, which included patients with known IBD who likely have undergone treatment that may have altered their gut microbiomes.

 

Biomarkers Found and Tested

The gut microbiota exists in two states: Eubiosis, which supports health, and inflammatory dysbiosis, an imbalanced state associated with disease, most notably IBD, Choi noted.

Although many studies have explored the differences between these two states, there have been three major challenges in identifying IBD biomarkers: The studies have had small sample sizes, they’ve concentrated on a single analytical approach, and they’ve had low reproducibility.

To overcome those challenges, researchers used a large-scale dataset and used multiple methods to determine which analytical approach yielded the most reliable results, Choi said. They validated their results in three independent cohorts with diverse populations.

The study included 414 patients with Crohn’s disease, 880 with ulcerative colitis, and 2467 healthy control individuals from 21 centers in the Republic of Korea. Their gut microbiota profiles were analyzed from stool samples using 16S ribosomal RNA gene sequencing.

Researchers used four techniques to identify potential IBD biomarkers in the samples: differential abundance analysis, supervised random forest machine learning, unsupervised network analysis, and literature-based curation.

Biomarker candidates generated by these methods were then compared for their diagnostic ability using a machine learning model. The findings were tested in three independent cohorts — one domestic and one international population, both of which included patients with IBD and healthy control individuals, and one dataset of patients without IBD.

The results showed that there were distinct differences in the microbial composition between healthy control individuals and patients with Crohn’s disease and with ulcerative colitis. Patients with IBD, particularly those with Crohn’s disease, consistently had a significantly higher prevalence of dysbiosis, Choi said.

Each of the four analytical techniques revealed distinct microbial biomarkers associated with IBD in general, as well as with Crohn’s disease and ulcerative colitis individually.

When comparing IBD patients overall with healthy control individuals, supervised machine learning resulted in the most effective biomarker sets for distinguishing between groups, with the area under the receiver operating characteristics curve (AUC) reaching 0.971. By comparison, the AUC results were 0.94 for literature-based curation, 0.924 for differential abundance analyses, and 0.914 for unsupervised network analysis.

Supervised machine learning also outperformed the other techniques when distinguishing between healthy control individuals and patients with ulcerative colitis (AUC, 0.958), and between patients with ulcerative colitis and those with Crohn’s disease (AUC, 0.902).

All the techniques performed strongly when distinguishing between healthy control individuals and patients with Crohn’s disease, with AUCs ranging from 0.911 to 0.95.

When the researchers turned to the independent datasets, they found that the biomarkers were able to distinguish between healthy control individuals and patients with IBD in general and particularly between healthy control individuals and those with Crohn’s disease, with AUCs of 0.969 in the domestic cohort and 0.848 in the international cohort.

The non-IBD cohort also demonstrated that the biomarkers were able to differentiate patients with metabolic dysfunction–associated steatotic liver disease, colorectal cancer, rheumatoid arthritis, and irritable bowel syndrome from those with ulcerative colitis and Crohn’s disease with a high degree of accuracy (AUCs ranging from 0.97 to 0.999).

 

Diagnostic Utility Questioned

Speaking from the audience, James Lindsay, PhD, professor of inflammatory bowel disease, Barts and The London School of Medicine and Dentistry, England, questioned the utility of the findings.

“Obviously, all these patients had IBD, and so they will have had treatment with antibiotics, etc,” he said. “Surely the right validation cohort would be a group of people who have not yet been diagnosed with IBD to see whether your biomarker is able to separate those because the reason that people with IBD will have a difference is all the reasons that you have explained, ie, these patients were on treatment at the time that you took the samples.”

As a result, the biomarker panel isn’t for diagnosis but to confirm known disease, he added.

It’s important to look for microbiome signals of IBD, session co-chair, Lissy de Ridder, MD, PhD, associate professor of pediatric gastroenterology, Erasmus MC Sophia Children’s Hospital, Rotterdam, the Netherlands, said in an interview.

De Ridder agreed that the biomarkers need to be validated in patients who aren’t on treatments that could affect their gut microbiomes. Not only do medications for IBD make a big difference but also do other drugs such as proton-pump inhibitors and antibiotics, as well as dietary interventions.

“Having said that, because it’s a large population, that’s always a good start to take lessons from and then go more into the details” in further analyses, de Ridder added.

This research was funded by a grant from the Korea Health Technology R&D Project through the Korea Health Industry Development Institute, funded by the Ministry of Health & Welfare, Republic of Korea. No relevant financial relationships were declared.

A version of this article appeared on Medscape.com.

Gut microbial biomarkers identified using machine learning can differentiate patients with inflammatory bowel disease (IBD) from healthy control individuals, according to a study presented at the European Crohn’s and Colitis Organisation (ECCO) 2025 Congress.

Of the four techniques the researchers tested, a “machine-learning approach achieves the highest diagnostic accuracy, effectively distinguishing IBD and particularly differentiating Crohn’s disease from healthy controls in independent cohorts,” said study presenter Jee-Won Choi, department of biology, Kyung Hee University, Seoul, Republic of Korea.

“Integrating microbial markers with conventional diagnostics could enhance [their] clinical utility,” Choi said. However, further research is needed to determine the long-term validity of the biomarkers.

Some experts questioned the reliability of the markers for IBD diagnosis because of the makeup of study populations, which included patients with known IBD who likely have undergone treatment that may have altered their gut microbiomes.

 

Biomarkers Found and Tested

The gut microbiota exists in two states: Eubiosis, which supports health, and inflammatory dysbiosis, an imbalanced state associated with disease, most notably IBD, Choi noted.

Although many studies have explored the differences between these two states, there have been three major challenges in identifying IBD biomarkers: The studies have had small sample sizes, they’ve concentrated on a single analytical approach, and they’ve had low reproducibility.

To overcome those challenges, researchers used a large-scale dataset and used multiple methods to determine which analytical approach yielded the most reliable results, Choi said. They validated their results in three independent cohorts with diverse populations.

The study included 414 patients with Crohn’s disease, 880 with ulcerative colitis, and 2467 healthy control individuals from 21 centers in the Republic of Korea. Their gut microbiota profiles were analyzed from stool samples using 16S ribosomal RNA gene sequencing.

Researchers used four techniques to identify potential IBD biomarkers in the samples: differential abundance analysis, supervised random forest machine learning, unsupervised network analysis, and literature-based curation.

Biomarker candidates generated by these methods were then compared for their diagnostic ability using a machine learning model. The findings were tested in three independent cohorts — one domestic and one international population, both of which included patients with IBD and healthy control individuals, and one dataset of patients without IBD.

The results showed that there were distinct differences in the microbial composition between healthy control individuals and patients with Crohn’s disease and with ulcerative colitis. Patients with IBD, particularly those with Crohn’s disease, consistently had a significantly higher prevalence of dysbiosis, Choi said.

Each of the four analytical techniques revealed distinct microbial biomarkers associated with IBD in general, as well as with Crohn’s disease and ulcerative colitis individually.

When comparing IBD patients overall with healthy control individuals, supervised machine learning resulted in the most effective biomarker sets for distinguishing between groups, with the area under the receiver operating characteristics curve (AUC) reaching 0.971. By comparison, the AUC results were 0.94 for literature-based curation, 0.924 for differential abundance analyses, and 0.914 for unsupervised network analysis.

Supervised machine learning also outperformed the other techniques when distinguishing between healthy control individuals and patients with ulcerative colitis (AUC, 0.958), and between patients with ulcerative colitis and those with Crohn’s disease (AUC, 0.902).

All the techniques performed strongly when distinguishing between healthy control individuals and patients with Crohn’s disease, with AUCs ranging from 0.911 to 0.95.

When the researchers turned to the independent datasets, they found that the biomarkers were able to distinguish between healthy control individuals and patients with IBD in general and particularly between healthy control individuals and those with Crohn’s disease, with AUCs of 0.969 in the domestic cohort and 0.848 in the international cohort.

The non-IBD cohort also demonstrated that the biomarkers were able to differentiate patients with metabolic dysfunction–associated steatotic liver disease, colorectal cancer, rheumatoid arthritis, and irritable bowel syndrome from those with ulcerative colitis and Crohn’s disease with a high degree of accuracy (AUCs ranging from 0.97 to 0.999).

 

Diagnostic Utility Questioned

Speaking from the audience, James Lindsay, PhD, professor of inflammatory bowel disease, Barts and The London School of Medicine and Dentistry, England, questioned the utility of the findings.

“Obviously, all these patients had IBD, and so they will have had treatment with antibiotics, etc,” he said. “Surely the right validation cohort would be a group of people who have not yet been diagnosed with IBD to see whether your biomarker is able to separate those because the reason that people with IBD will have a difference is all the reasons that you have explained, ie, these patients were on treatment at the time that you took the samples.”

As a result, the biomarker panel would serve not to diagnose IBD but to confirm known disease, he added.

It’s important to look for microbiome signals of IBD, session co-chair Lissy de Ridder, MD, PhD, associate professor of pediatric gastroenterology, Erasmus MC Sophia Children’s Hospital, Rotterdam, the Netherlands, said in an interview.

De Ridder agreed that the biomarkers need to be validated in patients who aren’t on treatments that could affect their gut microbiomes. Not only do medications for IBD make a big difference, but so do other drugs, such as proton-pump inhibitors and antibiotics, as well as dietary interventions.

“Having said that, because it’s a large population, that’s always a good start to take lessons from and then go more into the details” in further analyses, de Ridder added.

This research was funded by a grant from the Korea Health Technology R&D Project through the Korea Health Industry Development Institute, funded by the Ministry of Health & Welfare, Republic of Korea. No relevant financial relationships were declared.

A version of this article appeared on Medscape.com.

Virtual Chromoendoscopy Beats Other Modalities at Neoplasia Detection in IBD

Article Type
Changed
Tue, 03/04/2025 - 13:59

BERLIN — A multicenter study comparing three endoscopic imaging techniques used to monitor patients with inflammatory bowel disease (IBD) for neoplasia found that virtual chromoendoscopy has the highest detection rate.

The research, presented at the European Crohn’s and Colitis Organisation (ECCO) 2025 Congress, also found “significant variability in IBD surveillance practice in the real world,” said study presenter Chandni Radia, MD, Department of Gastroenterology, King’s College Hospital NHS Foundation Trust, London, England.

Although dye chromoendoscopy with targeted biopsies traditionally was considered the gold standard for neoplasia detection in patients with IBD, randomized trials have challenged its superiority over virtual chromoendoscopy and high-definition white-light endoscopy, the researchers noted. They hypothesized that the modality used would not affect the neoplasia detection rate.

To investigate, they conducted a retrospective observational cohort study of adults with ulcerative colitis, Crohn’s disease, or primary sclerosing cholangitis (PSC) who underwent routine clinical IBD surveillance at one of five centers in the United Kingdom between 2019 and 2023. They examined data from the endoscopy reporting software, alongside endoscopy reports, endoscopy images, and electronic patient records.

In all, 2673 colonoscopies performed on 2050 patients were included, with 1032 procedures using dye chromoendoscopy, 366 using virtual chromoendoscopy, and 1275 using high-definition white-light endoscopy.

The overall neoplasia detection rate was 11.4%, “which is very similar to what has previously been seen in the literature,” Radia said.

However, the detection rate varied significantly by procedure: 19% in virtual chromoendoscopy, 12% in dye chromoendoscopy, and 9% in white-light endoscopy (P < .001). After accounting for a range of potential confounding factors, virtual chromoendoscopy still had the highest neoplasia detection rate.
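One plausible way to run an adjusted comparison of that kind — shown here on simulated data with assumed variable names and effect sizes, not the study’s dataset or exact model — is a logistic regression that yields odds ratios for each modality relative to white-light endoscopy.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 600  # simulated colonoscopies; all values and effect sizes below are invented

modality = rng.choice(["WLE", "DCE", "VCE"], size=n, p=[0.5, 0.38, 0.12])
age = rng.normal(55, 12, size=n)
psc = rng.integers(0, 2, size=n)
center = rng.choice(list("ABCDE"), size=n)

# Simulate neoplasia detection with a modest modality effect so the model has a signal
logit_p = (-2.3 + 0.5 * (modality == "VCE") + 0.2 * (modality == "DCE")
           + 0.01 * (age - 55) + 0.4 * psc)
neoplasia = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

df = pd.DataFrame({"neoplasia": neoplasia, "modality": modality,
                   "age": age, "psc": psc, "center": center})

# Adjusted comparison: odds of neoplasia detection by modality,
# controlling for age, PSC status, and center
model = smf.logit("neoplasia ~ C(modality, Treatment('WLE')) + age + psc + C(center)",
                  data=df).fit(disp=False)
print(np.exp(model.params).round(2))  # odds ratios vs white-light endoscopy
```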

Dye chromoendoscopy had a “prolonged withdrawal time and increased need for targeted biopsies without improving their neoplasia yield, which goes against our aspirations of sustainability,” Radia noted.

“It was interesting to see that the procedures with the most dye chromoendoscopy seem to have the longest withdrawal time, and those with the most white-light endoscopy seem to have the shortest,” she said. The difference remained significant even after controlling for procedures with polypectomy, “which has a significantly longer withdrawal time compared to procedures without.” 

 

Results Varied by Center

There was wide variability between the five centers on several findings. The neoplasia detection rate ranged from 7.4% to 17.2%, depending on the center.

The surveillance method also varied. One center, for example, used white-light endoscopy in 82% of cases and dye chromoendoscopy in the other 18%. At another center, 61% of patients had dye chromoendoscopy, 36% white-light endoscopy, and 3% virtual chromoendoscopy. In a third center, 48% had virtual chromoendoscopy, 46% white-light endoscopy, and 6% dye chromoendoscopy.

The centers had varying proportions of patients with each of the three conditions, with ulcerative colitis ranging from 46% to 63%, Crohn’s disease from 9% to 39%, and PSC from 14% to 45%.

The heterogeneity of patients between the modality groups is one of the study’s limitations, Radia said. Others are the shorter withdrawal time with white-light endoscopy and the lack of standardized withdrawal time for the procedures.

The research team’s analyses are ongoing and include examination of the types of neoplasia detected, as well as accounting for endoscopist experience and patients who underwent two procedures with different modalities, Radia said.

 

Reflection of ‘Real-Life Practice’

Because the study was a retrospective analysis, it contains inherent biases and other issues, Raf Bisschops, MD, PhD, director of endoscopy, University of Leuven, Belgium, who co-chaired the session, said in an interview.

However, it was a “thorough analysis” that reflects “real-life practice,” he said. As such, it lends “huge support” to virtual chromoendoscopy, which “actually goes against the new [British Society of Gastroenterology] guideline that is about to come out.” The society plans to recommend in favor of dye chromoendoscopy, but the new study findings could still be incorporated into the upcoming guidelines so as to also endorse virtual chromoendoscopy.

Whatever the modality used, clinicians need to make sure they “pay attention” when looking for small neoplastic lesions, and “anything that can help you do that, that draws your attention to cell lesions ... can be helpful,” Bisschops said.

Performing targeted biopsies, as with dye chromoendoscopy, can be problematic, as “people don’t pay attention anymore to those cell lesions; they just focus on taking the 32 biopsies, which is a huge endeavor and it’s a pain to do it,” he added.

Radia has received a Research Training Fellowship Award from the UK patient organization PSC Support. No other funding was declared. Radia declared relationships with AbbVie, Galapagos, and Dr. Falk Pharma.

A version of this article appeared on Medscape.com.

Antibody Profiles Predict IBD Up To 10 Years Before Onset

Article Type
Changed
Tue, 03/04/2025 - 13:55

An individual’s profile of antibody responses to a range of herpes viruses and encapsulated bacteria such as Streptococcus could predict the onset of inflammatory bowel disease (IBD) up to 10 years prior to diagnosis, with differential responses between Crohn’s disease and ulcerative colitis, a new study suggested.

The research was presented at the European Crohn’s and Colitis Organisation (ECCO) 2025 Congress.

“High-throughput and high-resolution antibody profiling delineates a previously underappreciated landscape of selective serological responses in inflammatory bowel disease,” said study presenter Arno R. Bourgonje, MD, PhD, of the Henry D. Janowitz Division of Gastroenterology, Icahn School of Medicine at Mount Sinai, New York City.

The discovery represents just the “tip of the iceberg” in terms of understanding how antibody response could predict IBD onset, he added. Although validation studies are ongoing, the findings “allow for novel insights into disease pathogenesis and also for allowing for disease prediction.”

In IBD, the integrity of the intestinal barrier is compromised and luminal agents, like bacteria, can leak through, which leads to immune activation, Bourgonje said.

However, only a few serological antibody responses are known to occur in IBD, such as antibodies against the yeast Saccharomyces cerevisiae and those against the cytoplasm of neutrophils, he said.

But most antibody responses are directed against bacteria, Bourgonje noted. The gut microbiome represents thousands of different bacterial species, each of which encodes thousands of different genes, amounting to a tremendous number of potential antigens. But conventional antibody-profiling technologies weren’t powerful enough to identify antibodies in patients with IBD that signal an immune response to potential antigens in the gut.

To get at that problem, the researchers recently leveraged a high-throughput technology called phage-display immunoprecipitation sequencing (PhIP-Seq) to look for specific immune-based biomarker signatures in the blood of individuals with IBD. This effort revealed a distinct repertoire of antibodies not only against bacteria but also against viruses and cell antigens.

The researchers next turned their sights on discovering whether they could find evidence of immunological alterations before IBD onset to enable disease prediction.

 

Predictive Signatures Found

The team used a longitudinal preclinical IBD cohort called PREDICTS (Proteomic Evaluation and Discovery in an IBD Cohort of Tri-service Subjects) that is housed in the US Department of Defense Serum Repository.

Using PhIP-Seq, the researchers analyzed serum samples from 200 individuals who developed Crohn’s disease, 200 who developed ulcerative colitis, and 100 non-IBD controls matched for age, sex, race, and study time point. The samples were collected approximately 2 years, 4 years, and 10 years prior to diagnosis, as well as around the time of diagnosis.

The results showed that, compared with healthy controls, the diversity of the antibody repertoire was significantly lower in the sera of individuals with preclinical Crohn’s disease (P < .05) and ulcerative colitis (P < .001), with the lowest similarity seen in people with preclinical Crohn’s disease approximately 4 years prior to their diagnosis (P < .001).
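As a rough illustration of how repertoire breadth might be compared between groups, the sketch below applies a nonparametric test to simulated per-sample peptide counts; the numbers and the choice of test are assumptions, not the authors’ analysis.

```python
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(7)

# Simulated repertoire breadth (number of enriched peptides per serum sample);
# the counts and group sizes are invented, not the PREDICTS data.
controls = rng.poisson(lam=450, size=100)
preclinical_cd = rng.poisson(lam=380, size=200)

# Nonparametric comparison of breadth between preclinical cases and controls
stat, p = mannwhitneyu(preclinical_cd, controls, alternative="two-sided")
print(f"median controls = {np.median(controls):.0f}, "
      f"median preclinical CD = {np.median(preclinical_cd):.0f}, P = {p:.2g}")
```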

The study also found that, compared with healthy controls, antibody responses in individuals with preclinical Crohn’s disease against herpes viruses such as Epstein-Barr virus (EBV), cytomegalovirus (CMV), and herpes simplex virus (HSV)–1 and HSV-2 were significantly higher approximately 10 years prior to the diagnosis of Crohn’s disease, whereas anti-Streptococcus responses were lower.

In individuals with ulcerative colitis, antibody responses to EBV, CMV, HSV-1, and influenza viruses were significantly higher than those in healthy controls approximately 10 years prior to diagnosis, whereas anti-rhinovirus responses were lower.

Further analysis demonstrated that antibody responses to CMV and EBV proteins increased over the course of the preclinical phase of Crohn’s disease vs healthy controls (P = .008 and P = .011, respectively).

Similarly, autoantibody responses to MAP kinase–activating death domain increased during the preclinical phase of ulcerative colitis vs healthy controls (P = .0025), whereas anti-Streptococcus responses decreased (P = .005).

Interestingly, no single antibody response difference from healthy controls accurately predicted the onset of IBD 10 years prior to diagnosis, but distinct sets of antibody responses did, with areas under the receiver operating characteristic curve of 0.90 for Crohn’s disease and 0.84 for ulcerative colitis.
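The sketch below illustrates the general idea of combining a panel of antibody responses into a single predictor scored by cross-validated AUC. The marker values are simulated, and the penalized logistic regression is an assumed choice rather than the model used in the study.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
n_cases, n_controls, n_markers = 200, 100, 30

# Simulated antibody-response intensities (e.g., anti-EBV, anti-CMV, anti-Streptococcus);
# the values, effect size, and choice of 30 markers are all assumptions.
cases = rng.normal(loc=0.3, scale=1.0, size=(n_cases, n_markers))
controls = rng.normal(loc=0.0, scale=1.0, size=(n_controls, n_markers))
X = np.vstack([cases, controls])
y = np.array([1] * n_cases + [0] * n_controls)

# Combine the panel in a penalized logistic regression and score by cross-validated AUC
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
auc = cross_val_score(clf, X, y, cv=cv, scoring="roc_auc").mean()
print(f"cross-validated AUC of the combined antibody panel: {auc:.2f}")
```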

 

A Promising Start

The study has potential to be useful for identifying people at risk for IBD, Robin Dart, MD, PhD, a consultant gastroenterologist at Guy’s and St Thomas’ Hospital, London, England, who co-chaired the session, said in an interview.

The difference in antibody responses to viral and bacterial antigens between Crohn’s disease and ulcerative colitis could point toward underlying biological mechanisms, although it is “too early to say,” Dart said.

However, “when you do these kind of big fishing exercises” and identify microbes that may be implicated in IBD, “you end up finding more questions than answers,” although that “can only be a good thing,” he added.

Bourgonje noted that the study cohort consisted entirely of men enrolled in the US Army, limiting the applicability of the findings. Another limitation was that researchers were unable to control for smoking, antibiotic use, and diet, all of which could have affected the results.

This study was funded by the Leona M. and Harry B. Helmsley Charitable Trust. Bourgonje declared relationships with Janssen Pharmaceuticals, Ferring, and AbbVie. Other authors also declared numerous relationships.

A version of this article appeared on Medscape.com.

Eye Toxicities Are a Growing Concern With Certain ADCs

Article Type
Changed
Wed, 12/18/2024 - 13:55

Despite being a targeted therapy, antibody-drug conjugates (ADCs) can cause significant off-target toxicity to the eyes of patients being treated for advanced multiple myeloma or cervical cancer, yet the risks remain relatively unknown, according to oncologists and ophthalmologists.

In interviews with Medscape Medical News, these experts called for greater collaboration between oncologists and ophthalmologists.

ADCs combine a monoclonal antibody targeted at an antigen overexpressed on cancer cells with a toxic chemotherapy payload — the aim being to maximize the effectiveness of the drug against the tumor while minimizing the damage to healthy tissues and reducing systemic toxicity.

Yet trastuzumab duocarmazine (T-Duo), a third-generation human epidermal growth factor receptor 2 (HER2)–targeted ADC designed to treat HER2-positive breast cancer, was recently found to have a notable adverse effect in the TULIP trial of 437 patients.

As reported by Medscape Medical News, the drug was associated with a significant increase in progression-free survival over physician’s choice of therapy. However, 78% of patients in the ADC group experienced at least one treatment-emergent ocular toxicity adverse event vs 29.2% of those in the control group.

Moreover, grade 3 or higher ocular toxicity events were reported by 21% of patients in the experimental group compared with none of those who received physician’s choice.

 

Ocular Toxicities Seen on Ocular Surface

Ocular toxicities with these drugs are “not necessarily a new thing,” said Joann J. Kang, MD, director, Cornea and Refractive Surgery, and associate professor of ophthalmology at Albert Einstein College of Medicine, Montefiore Medical Center, Bronx, New York.

“But what we’re seeing with certain ADCs is a lot of ocular toxicity, especially on the ocular surface,” with the degree of toxicity varying depending on the ADC in question. “It’s definitely a real concern.”

Kang noted that separate from T-Duo, certain ADCs already come with black box warnings for ocular toxicity, including:

  • Belantamab mafodotin (Blenrep) — approved for relapsed or refractory multiple myeloma and carries a warning specifically for keratopathy.
  • Tisotumab vedotin (Tivdak) — indicated for recurrent or metastatic cervical cancer and can cause changes in the corneal epithelium and conjunctiva.
  • Mirvetuximab soravtansine (Elahere) — used to treat folate receptor (FR) alpha–positive ovarian, fallopian tube, and peritoneal cancers and can lead to keratopathy, blurred vision, and dry eyes.

Indeed, the American Academy of Ophthalmology 2024 annual meeting saw research presented indicating that mirvetuximab was associated with moderate or severe corneal toxicity in 47% of patients treated for primary gynecologic malignancies.

As reported by Medscape Medical News, the study, by researchers at Byers Eye Institute of Stanford University in Stanford, California, was a retrospective analysis of 36 eyes of 18 women who received mirvetuximab for FR alpha–positive, platinum-resistant primary ovarian cancer.

 

What Are the Causes?

But why would a drug that is targeted specifically to a cancer tumor, thanks to the presence of a monoclonal antibody, cause off-target effects such as ocular toxicity?

Kathy D. Miller, MD, professor of oncology and medicine at Indiana University School of Medicine in Indianapolis, pointed out that they are targeted in a relative and not absolute sense, meaning that the antigen target may not be truly limited to the tumor cells.

There can also be “a lot of ways that you could get systemic toxicities,” she said.

For example, if the linker connecting the antibody and the chemotherapy payload breaks prematurely or is not stable, or if the drug leaches out into the tumor microenvironment and then is “picked up into the circulation, that can give you systemic toxicity,” she said.

In addition, the drug may, once it is in the tumor cells, be metabolized to an active metabolite that could, again, result in systemic exposure.

 

Side Effects Are Underappreciated and Distressing

Ocular toxicity remains underappreciated among oncologists prescribing these drugs. One reason is that it “did not get enough attention” in the initial clinical trial reports, Miller said she suspects.

Another potential reason for this is that “we’re not used to thinking about it because it’s not particularly common among the drugs that oncologists use frequently,” she added. Additionally, it tends to come up later during treatment, “so people have to be on therapy for some time before you start to see it.”

Nevertheless, Miller underlined that ocular toxicity “can be particularly distressing for patients, as it’s uncomfortable [and] can lead to scarring, so some of the vision issues can be permanent.”

“We often see in these situations that there are different types of ocular toxicities that present in different patients,” said Jane L. Meisel, MD, co-director, Breast Medical Oncology, Department of Hematology and Medical Oncology at Emory University School of Medicine in Atlanta.

“Corneal damage is pretty common, and patients can present with blurry vision, or dry eyes, or light sensitivity. And unlike some side effects, these are things that really impact people at every waking moment of their day.”

“So they’re pretty clinically significant side effects, even if they’re not life-threatening,” Meisel emphasized.

Miller suspects that more heavily pretreated patients may be more likely to experience ocular toxicity, as “there’s a much higher incidence of dry eyes in our patients than we recognize.”

She added: “We don’t usually ask about it, and we certainly don’t routinely do Schirmer’s tests,” which determine whether the eye produces enough tears to keep it moist.

 

Preventive Measures

For patients receiving tisotumab or mirvetuximab who experience ocular toxicity, Kang said the recommendation is to use steroid eye drops before, during, and after treatment with the ADC.

However, she noted that steroids have not been found to be useful in patients given belantamab, so clinicians have tried vasoconstrictor eye drops immediately prior to the infusion, as well as ocular cooling masks, which “are thought to help by reducing blood supply to the ocular areas.”

Other approaches to minimize ocular toxicity have included longer infusion times, so it’s “not so much of a hefty dose at one time,” Kang added.

She underlined that grade 2 and 3 ocular toxicities can lead to dose delays or dose modifications, and “usually by the time you get a grade 4 event, then you may need to discontinue the medication.”

This can have consequences for the patients because they are often “very sick, and this may be their third agent that they’re trying,” or it may be that their tumor is responding to a new treatment, but it has to be withheld because of an ocular toxicity.

“It can be incredibly frustrating for patients, and also for oncologists, and then for ophthalmologists,” Kang said.

 

Closer Collaboration Between Specialists Needed

What’s known about ocular side effects in patients taking ADCs underlines that there is a need for closer collaboration between oncologists and ophthalmologists.

“In oncology, especially as immunotherapies came to the forefront, our relationships with our endocrinology colleagues have become stronger because we’ve needed them to help us manage things like thyroid toxicity and pituitary issues related to immunotherapy,” Meisel said.

With toxicities that may be “very impactful for patient quality of life, like ocular toxicity, we will need to learn more about them and develop protocols for management, along with our ophthalmology colleagues, so that we can keep patients as comfortable as possible, while maximizing the efficacy of these drugs.”

Miller agreed, saying oncologists need to have “a conversation with a local ophthalmologist,” although she conceded that, in many areas, such specialists “are in short supply.”

The oncologist “not only needs to be aware” of and looking for ocular toxicity when using these ADCs but also needs to be thinking: “If I run into trouble here, who’s my ophthalmology backup? Are they familiar with this drug? And do we have a plan for the multispecialty management of patients who run into this toxicity?”

 

Setting Counts When Assessing Toxicities

But do all these considerations mean that ADCs’ potential ocular toxicity should give clinicians pause when considering whether to use these drugs?

“What my patients most want are drugs that work; that are effective in controlling their tumors,” Miller said.

“Every drug we use has potential toxicities, and which toxicities are most physically troublesome [or] are the greatest concern may vary from patient to patient, and it may vary a lot from patients with metastatic disease to those in the curative setting.”

She explained that “toxicities that might not be prohibitive at all in the metastatic setting [may] have to be a much bigger part of our considerations” when moving drugs into the adjuvant or neoadjuvant setting.

This, Miller underlined, is where the ocular toxicity with these ADCs “may be much more prohibitive.”

TULIP was funded by Byondis BV.

Turner declared relationships with Novartis, AstraZeneca, Pfizer, Merck Sharp & Dohme, Lilly, Repare Therapeutics, Roche, GlaxoSmithKline, Gilead Sciences, Inivata, Guardant Health, Exact Sciences, and Relay Therapeutics.

Meisel declared relationships with Novartis, AstraZeneca, Genentech, Seagen, Olema Oncology, GE Healthcare, Pfizer, Stemline, and Sermonix Pharmaceuticals.

 

A version of this article appeared on Medscape.com.


New Hope for Antimicrobial Peptides?

Article Type
Changed
Fri, 12/13/2024 - 13:12

The story of antimicrobial peptides (AMPs), particularly in tackling antibiotic resistance, has been one of false dawns and unfulfilled promises. But perhaps a new generation of “smarter” compounds could see them find a wider role in clinical practice, said experts.

AMPs may be small molecules, consisting of short chains of amino acids, but these naturally occurring compounds have an important function: They are the “frontline defense” against invasive bacteria, said Henrik Franzyk, MSc Engineering, associate professor in the Department of Drug Design and Pharmacology at the University of Copenhagen in Denmark.

 

Multifunction Line of Defense

AMPs are cationic, meaning they are positively charged. “The reason why nature has maintained these molecules is that all the microbes out there have a negative surface charge,” explained Hans-Georg Sahl, PhD, emeritus professor of pharmaceutical microbiology at the University of Bonn in Germany.

AMPs also have hydrophobic regions, making many of them amphipathic; this combination of hydrophobic and hydrophilic stretches lets them insert into cell membranes and rupture them, much as a detergent does.

“Thus, the content of a cell gets released, and it destroys the pathogen,” explained Paulina Szymczak, a PhD candidate in the Institute of AI for Health at Helmholtz Munich, Neuherberg, Germany.

“There are variations of that theme,” said Eefjan Breukink, PhD, professor of microbial membranes and antibiotics at Utrecht University in the Netherlands. “And then it depends on the sequence of the particular peptide,” as some can cross the cell membrane and damage the bacterium internally.

Szymczak explained that AMPs can, in this way, target the cell DNA, as both the membrane and the DNA are negatively charged. “That’s also what makes them so powerful because they don’t have just one mechanism of action, as opposed to conventional antibiotics.” 

 

Indiscriminate Killers

But they also have another crucial function. They activate the innate immune system via so-called resident immune cells that are “sitting in the tissues and waiting for bacteria to turn up,” explained Franzyk.

“The problem with antibodies is that they typically need to replicate,” he continued, which takes between 4 and 7 days — a timeline that is much better suited to tackling a viral infection. Bacteria, on the other hand, have a replication cycle of just 30 minutes.

Another big problem is that AMPs kill cells indiscriminately, including our own.

“But the human body is clever in that it only produces these antimicrobial peptides where the bacteria are, so they are not circulating in the blood,” said Franzyk. If a small part of tissue becomes infected, the innate immune cells start producing AMPs, which may kill the bacteria, or call on other immune cells to help.

As part of this process, “they will also kill part of our own tissue, but that’s the price we have to pay,” he said.

 

Local Applications

It is this aspect that has, so far, limited the use of AMPs in clinical practice, certainly as a replacement for conventional antibiotics limited by bacterial resistance. The trials conducted so far have been, by and large, negative, which has dampened enthusiasm and led to the perception that the risk they pose is too great for large-scale investment.

AMPs “are not made for what we need from antibiotics in the first place,” explained Sahl. “That is, a nice, easy distribution in the body, going into abscesses” and throughout the tissues.

He continued that AMPs are “more about controlling the flora in our bodies,” and they are “really not made for being used systemically.” 

Szymczak and colleagues are now working on designing active peptides with a strong antibacterial profile but limited toxicity for systemic use.

However, the “downside with these peptides is that they are not orally available, so you can’t take a pill,” Breukink said, but instead they need to be administered intravenously.

There are, nevertheless, some antibiotics in clinical use that have the same molecular features as AMPs. These include colistin, a last-resort treatment for multidrug-resistant gram-negative bacteria, and daptomycin, which is used in the treatment of systemic infections caused by gram-positive species.

Szymczak added that there have been successes in using AMPs in a more targeted way, such as using a topical cream. Another potentially promising avenue is lung infections, which are being studied in mouse models.

 

Less Prone to Resistance

Crucially, AMPs are markedly less prone to bacterial resistance than conventional antibiotics, partly because of their typical target: the cell membrane.

“Biologically and evolutionarily, it is a very costly operation to rebuild the membrane and change its charge,” Szymczak explained. “It’s quite hard for bacteria to learn this because it’s not a single protein that you have to mutate but the whole membrane.”

This is seen in the laboratory, where it takes around five generations, or passages, for bacteria to develop resistance when grown in the presence of antibiotics, but up to 40 passages when cultured with an AMP.

The limits of the ability of AMPs to withstand the development of bacterial resistance have been tested in the real world.

Colistin has been used widely in Asia as a growth promoter, especially in pig farming. Franzyk explained that farmers have used enormous quantities of this AMP-based antibiotic, which has indeed driven the development of resistance; resistant bacteria have contaminated meat destined for human consumption, allowing the resistance to spread to other parts of the world.

“The bad thing about this is it’s not something each individual bacteria needs to acquire,” he said. Because the resistance genes are carried on small, circular pieces of DNA called plasmids, they “can be transferred from one bacterial species to another.”

 

Novel Avenues

Franzyk suggested that AMPs could nevertheless be used in combination with, or to modify, existing antibiotics to revitalize those for which there is already bacterial resistance, or to allow antibiotics that ordinarily target only gram-positive bacteria to also treat gram-negative infections, for example.

Szymczak and her colleagues are using artificial intelligence to design novel AMP candidates. Instead of manually going through compounds and checking their activity profiles in the lab, those steps are carried out computationally “so that, in the end, you synthesize as few candidates as possible” and can proceed to a mouse model “as fast as possible.”
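To make the idea of an in-silico pre-filter concrete, the toy sketch below scores candidate peptide sequences on two of the physicochemical properties discussed earlier, net positive charge and hydrophobic fraction, and keeps only those above user-chosen thresholds. This is a hypothetical illustration, not the Helmholtz group’s actual pipeline; the residue sets and cut-offs are simplifications chosen for the example.

```python
# Hypothetical toy pre-filter for antimicrobial peptide (AMP) candidates.
# It keeps sequences that are cationic (net positive charge) and have a
# sizeable hydrophobic fraction, two crude proxies for the properties
# discussed in the article. Real AMP-design pipelines use trained models,
# not two hand-picked descriptors.

POSITIVE = {"K", "R"}       # lysine, arginine (histidine ignored for simplicity)
NEGATIVE = {"D", "E"}       # aspartate, glutamate
HYDROPHOBIC = {"A", "V", "L", "I", "M", "F", "W", "C"}  # a common, simplified set

def net_charge(seq: str) -> int:
    return sum(aa in POSITIVE for aa in seq) - sum(aa in NEGATIVE for aa in seq)

def hydrophobic_fraction(seq: str) -> float:
    return sum(aa in HYDROPHOBIC for aa in seq) / len(seq)

def prefilter(candidates, min_charge=3, min_hydrophobic=0.4):
    """Yield sequences worth passing on to more expensive models or the lab."""
    for seq in candidates:
        seq = seq.upper()
        if net_charge(seq) >= min_charge and hydrophobic_fraction(seq) >= min_hydrophobic:
            yield seq

if __name__ == "__main__":
    # Made-up example sequences, purely for demonstration.
    demo = ["KRWWKWWRR", "GIGAVLKVLTTGLPALIS", "DDEEAGSTN"]
    print(list(prefilter(demo)))  # only the cationic, hydrophobic candidate survives
```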

She personally is looking at the issue of strain-specific activity to design a compound that would target, for example, only multidrug-resistant strains. “What we can do now is something that will target everything, so a kind of last resort peptide. But we are trying to make them smarter in their targets.”

Szymczak also pointed out that cancer cells are “negatively charged, similarly to bacterial cells, as opposed to mammalian cells, which are neutral.”

“So in theory, maybe we could design something that will target cancer cells but not our host cells, and that would be extremely exciting.” However, she underlined that, first, they are trying to tackle antimicrobial resistance before looking at other spaces.

Finally, Breukink is screening for small antibacterial compounds in fungi that are around half the size of a normal peptide and more hydrophobic, meaning there is a much greater chance of them being orally available.

But “you first have to test, of course,” he said, as “if you don’t have specific targets, then you will get problems with toxicity, or other issues that you do not foresee.” 

No funding was declared. No relevant financial relationships were declared.

A version of this article first appeared on Medscape.com.


Goodbye CHADSVASc: Sex Complicates Stroke Risk Scoring in AF

Article Type
Changed
Wed, 11/27/2024 - 04:48

The European Society of Cardiology (ESC) caused a stir when it recommended in its latest atrial fibrillation (AF) management guideline that gender no longer be included in the decision to initiate oral anticoagulation therapy.

The move aims to level the playing field between men and women and follows a more nuanced understanding of stroke risk in patients with AF, said experts. It also acknowledges the lack of evidence in people receiving cross-sex hormone therapy.

In any case, the guidelines, developed in collaboration with the European Association for Cardio-Thoracic Surgery and published by the European Heart Journal on August 30, simply follow 2023’s US recommendations, they added.

 

One Size Does Not Fit All

So, what do the ESC guidelines actually say?

They underline that, if left untreated, the risk for ischemic stroke is increased fivefold in patients with AF, and the “default approach should therefore be to provide oral anticoagulation to all eligible AF patients, except those at low risk for incident stroke or thromboembolism.”

However, the authors note that there is a lack of strong evidence on how to apply the current risk scores to help inform that decision in real-world patients.

Dipak Kotecha, MBChB, PhD, Professor of Cardiology at the University of Birmingham and University Hospitals Birmingham NHS Foundation Trust, Birmingham, England, and senior author of the ESC guidelines, said in an interview that “the available scores have a relatively poor ability to accurately predict which patients will have a stroke or thromboembolic event.”

Instead, he said “a much better approach is for healthcare professionals to look at each patient’s individual risk factors, using the risk scores to identify those patients that might not benefit from oral anticoagulant therapy.”

For these guidelines, the authors therefore wanted to “move away from a one-size-fits-all” approach, Kotecha said, and instead ensure that more patients can benefit from the new range of direct oral anticoagulants (DOACs) that are easier to take and with much lower chance of side effects or major bleeding.

To achieve this, they separated their clinical recommendations from any particular risk score, and instead focused on the practicalities of implementation.

 

Risk Modifier Vs Risk Factor

To explain their decision, the authors highlight that “the most popular risk score” is the CHA2DS2–VASc, which gives a point for female sex, alongside factors such as congestive heart failure, hypertension, and diabetes mellitus, and a sliding scale of points for increasing age.

Kotecha pointed out the score was developed before the DOACs were available and may not account for how risk factors have changed in recent decades.

The result is that CHA2DS2–VASc gives the same number of points to an individual with heart failure or prior transient ischemic attack as to a woman aged less than 65 years, “but the magnitude of increased risk is not the same,” Usha Beth Tedrow, MD, Associate Professor of Medicine, Brigham and Women’s Hospital, Boston, Massachusetts, said in an interview.

As far back as 2018, it was known that “female sex is a risk modifier, rather than a risk factor for stroke in atrial fibrillation,” Jose Joglar, MD, lead author of the 2023 ACC/AHA/ACCP/HRS Guideline for the Diagnosis and Management of Atrial Fibrillation, noted in an interview.

A Danish national registry study involving 239,671 AF patients treated between 1997 and 2015, nearly half of whom were women, showed that, at a CHA2DS2–VASc score of 0, the “risk of stroke between men and women is absolutely the same,” he said.

“It is not until after a CHA2DS2–VASc score of 2 that the curves start to separate,” Joglar, Program Director, Clinical Cardiac Electrophysiology Fellowship Program, The University of Texas Southwestern Medical Center, Dallas, continued, “but by then you have already made the decision to anticoagulate.”

More recently, Kotecha and colleagues conducted a population cohort study of the electronic healthcare records of UK primary care patients treated between 2005 and 2020, and identified 78,852 with AF; more than a third were women.

Their analysis, published on September 1, showed that women had a lower adjusted rate of the primary composite outcome of all-cause mortality, ischemic stroke, or arterial thromboembolism, driven by a reduced mortality rate.

“Removal of gender from clinical risk scoring could simplify the approach to which patients with AF should be offered oral anticoagulation,” Kotecha and colleagues concluded.

Joglar clarified that “women are at increased risk for stroke than men” overall, but by the time that risk “becomes manifest, other risk factors have come into play, and they have already met the criteria for anticoagulation.”

The authors of the latest ESC guideline therefore concluded that the “inclusion of gender complicates clinical practice both for healthcare professionals and patients.” Their solution was to remove the question of gender for decisions over initiating oral anticoagulant therapy in clinical practice altogether.

This includes individuals who identify as transgender or are undergoing sex hormone therapy, as all the experts interviewed by Medscape Medical News agreed that there is currently insufficient evidence to know if that affects stroke risk.

Instead, the guidelines state that the drugs are “recommended in those with a CHA2DS2-VA score of 2 or more and should be considered in those with a CHA2DS2-VA score of 1, following a patient-centered and shared care approach.”
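To make the practical difference concrete, here is a minimal sketch of the sex-free CHA2DS2-VA calculation with the thresholds quoted above applied on top. The component weights follow the widely published score definition, but the code is an illustration for this article, not a validated clinical tool.

```python
# Minimal sketch: CHA2DS2-VA (no point for sex) plus the ESC thresholds
# quoted above (>= 2: anticoagulation recommended; 1: should be considered).
# Illustration only, not decision support.

def cha2ds2_va(age: int, chf: bool, hypertension: bool, diabetes: bool,
               prior_stroke_tia_te: bool, vascular_disease: bool) -> int:
    score = 0
    score += 1 if chf else 0                              # C: congestive heart failure
    score += 1 if hypertension else 0                     # H: hypertension
    score += 2 if age >= 75 else (1 if age >= 65 else 0)  # A2 / A: age bands
    score += 1 if diabetes else 0                         # D: diabetes mellitus
    score += 2 if prior_stroke_tia_te else 0              # S2: prior stroke/TIA/thromboembolism
    score += 1 if vascular_disease else 0                 # V: vascular disease
    return score                                          # note: no point for female sex

def recommendation(score: int) -> str:
    if score >= 2:
        return "oral anticoagulation recommended"
    if score == 1:
        return "oral anticoagulation should be considered (shared decision)"
    return "no anticoagulation indicated on the score alone"

if __name__ == "__main__":
    s = cha2ds2_va(age=68, chf=False, hypertension=True, diabetes=False,
                   prior_stroke_tia_te=False, vascular_disease=False)
    print(s, "->", recommendation(s))  # 2 -> oral anticoagulation recommended
```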

“Dropping the gender part of the risk score is not really a substantial change” from previous ESC or other guidelines, as different points were required in the past to recommend anticoagulants for women and men, Kotecha said, adding that “making the approach easier for clinicians may avoid penalizing women as well as nonbinary and transgender patients.”

Anne B. Curtis, MD, SUNY Distinguished Professor, Department of Medicine, Jacobs School of Medicine & Biomedical Sciences, University at Buffalo in New York, agreed.

Putting aside the question of female sex, she said that there are not a lot of people under the age of 65 years with “absolutely no risk factors,” and so, “if the only reason you would anticoagulate” someone of that age is because they are a woman, that “doesn’t make a lot of sense to me.”

The ESC guidelines are “trying to say, ‘look at the other risk factors, and if anything is there, go ahead and anticoagulate,’” Curtis said in an interview.

“It’s actually a very thoughtful decision,” Tedrow said, and not “intended to discount risk in women.” Rather, it’s a statement that acknowledges the problem of recommending anticoagulation therapy in women “for whom it is not appropriate.”

Joglar pointed out that that recommendation, although not characterized in the same way, was in fact included in the 2023 US guidelines.

“We wanted to use a more nuanced approach,” he said, and move away from using CHA2DS2–VASc as the prime determinant of whether to start oral anticoagulation and towards a magnitude risk assessment, in which female sex is seen as a risk modifier.

“The Europeans and the Americans are looking at the same data, so we often reach the same conclusions,” Joglar said, although “we sometimes use different wordings.”

Overall, Kotecha expressed the hope that the move “will lead to better implementation of guidelines, at the end of the day.”

“That’s all we can hope for: Patients will be offered a more individualized approach, leading to more appropriate use of treatment in the right patients.”

Newer direct oral anticoagulants are “a much simpler therapy,” he added. “There is very little monitoring, a similar risk of bleeding as aspirin, and yet the ability to largely prevent the high rate of stroke and thromboembolism associated with atrial fibrillation.”

“So, it’s a big ticket item for our communities and public health, particularly as atrial fibrillation is expected to double in prevalence in the next few decades and evidence is building that it can lead to vascular dementia in the long-term.”

No funding was declared. Kotecha declared relationships with Bayer, Protherics Medicines Development, Boston Scientific, Daiichi Sankyo, Boehringer Ingelheim, the BMS-Pfizer Alliance, Amomed, and MyoKardia. Curtis declared relationships with Janssen Pharmaceuticals, Medtronic, and Abbott. Joglar declared no relevant relationships. Tedrow declared no relevant relationships.

A version of this article appeared on Medscape.com.

Publications
Topics
Sections

The European Society of Cardiology (ESC) caused a stir when they recommended in their latest atrial fibrillation (AF) management guideline that gender no longer be included in the decision to initiate oral anticoagulation therapy.

The move aims to level the playing field between men and women and follows a more nuanced understanding of stroke risk in patients with AF, said experts. It also acknowledges the lack of evidence in people receiving cross-sex hormone therapy.

In any case, the guidelines, developed in collaboration with the European Association for Cardio-Thoracic Surgery and published by the European Heart Journal on August 30, simply follow 2023’s US recommendations, they added.

 

One Size Does Not Fit All

So, what to the ESC guidelines actually say?

They underline that, if left untreated, the risk for ischemic stroke is increased fivefold in patients with AF, and the “default approach should therefore be to provide oral anticoagulation to all eligible AF patients, except those at low risk for incident stroke or thromboembolism.”

However, the authors note that there is a lack of strong evidence on how to apply the current risk scores to help inform that decision in real-world patients.

Dipak Kotecha, MBChB, PhD, Professor of Cardiology at the University of Birmingham and University Hospitals Birmingham NHS Foundation Trust, Birmingham, England, and senior author of the ESC guidelines, said in an interview that “the available scores have a relatively poor ability to accurately predict which patients will have a stroke or thromboembolic event.”

Instead, he said “a much better approach is for healthcare professionals to look at each patient’s individual risk factors, using the risk scores to identify those patients that might not benefit from oral anticoagulant therapy.”

For these guidelines, the authors therefore wanted to “move away from a one-size-fits-all” approach, Kotecha said, and instead ensure that more patients can benefit from the new range of direct oral anticoagulants (DOACs) that are easier to take and with much lower chance of side effects or major bleeding.

To achieve this, they separated their clinical recommendations from any particular risk score, and instead focused on the practicalities of implementation.

 

Risk Modifier Vs Risk Factor

To explain their decision the authors highlight that “the most popular risk score” is the CHA2DS2–VASc, which gives a point for female sex, alongside factors such as congestive heart failure, hypertension, and diabetes mellitus, and a sliding scale of points for increasing age.

Kotecha pointed out the score was developed before the DOACs were available and may not account for how risk factors have changed in recent decades.

The result is that CHA2DS2–VASc gives the same number of points to an individual with heart failure or prior transient ischemic attack as to a woman aged less than 65 years, “but the magnitude of increased risk is not the same,” Usha Beth Tedrow, MD, Associate Professor of Medicine, Brigham and Women’s Hospital, Boston, Massachusetts, said in an interview.

As far back as 2018, it was known that “female sex is a risk modifier, rather than a risk factor for stroke in atrial fibrillation,” noted Jose Joglar, MD, lead author of the 2023 ACC/AHA/ACCP/HRS Guideline for the Diagnosis and Management of Atrial Fibrillation said in an interview.

Danish national registry study involving 239,671 AF patients treated between 1997 and 2015, nearly half of whom were women, showed that, at a CHA2DS2–VASc score of 0, the “risk of stroke between men and women is absolutely the same,” he said.

“It is not until after a CHA2DS2–VASc score of 2 that the curves start to separate,” Joglar, Program Director, Clinical Cardiac Electrophysiology Fellowship Program, The University of Texas Southwestern Medical Center, Dallas, continued, “but by then you have already made the decision to anticoagulate.”

More recently, Kotecha and colleagues conducted a population cohort study of the electronic healthcare records of UK primary care patients treated between 2005 and 2020, and identified 78,852 with AF; more than a third were women.

Their analysis, published on September 1, showed that women had a lower adjusted rate of the primary composite outcome of all-cause mortality, ischemic stroke, or arterial thromboembolism, driven by a reduced mortality rate.

“Removal of gender from clinical risk scoring could simplify the approach to which patients with AF should be offered oral anticoagulation,” Kotecha and colleagues concluded.

Joglar clarified that “women are at increased risk for stroke than men” overall, but by the time that risk “becomes manifest, other risk factors have come into play, and they have already met the criteria for anticoagulation.”

The authors of the latest ESC guideline therefore concluded that the “inclusion of gender complicates clinical practice both for healthcare professionals and patients.” Their solution was to remove the question of gender for decisions over initiating oral anticoagulant therapy in clinical practice altogether.

This includes individuals who identify as transgender or are undergoing sex hormone therapy, as all the experts interviewed by Medscape Medical News agreed that there is currently insufficient evidence to know if that affects stroke risk.

Instead, guidelines state that the drugs are “recommended in those with a CHA2DS2-VA score of 2 or more and should be considered in those with a CHA2DS2-VA score of 1, following a patient-centered and shared care approach.”

“Dropping the gender part of the risk score is not really a substantial change” from previous ESC or other guidelines, as different points were required in the past to recommend anticoagulants for women and men, Kotecha said, adding that “making the approach easier for clinicians may avoid penalizing women as well as nonbinary and transgender patients.”

Anne B. Curtis, MD, SUNY Distinguished Professor, Department of Medicine, Jacobs School of Medicine & Biomedical Sciences, University at Buffalo in New York, agreed.

Putting aside the question of female sex, she said that there are not a lot of people under the age of 65 years with “absolutely no risk factors,” and so, “if the only reason you would anticoagulate” someone of that age is because they are a woman that “doesn’t make a lot of sense to me.”

The ESC guidelines are “trying to say, ‘look at the other risk factors, and if anything is there, go ahead and anticoagulate,” Curtis said in an interview.

“It’s actually a very thoughtful decision,” Tedrow said, and not “intended to discount risk in women.” Rather, it’s a statement that acknowledges the problem of recommending anticoagulation therapy in women “for whom it is not appropriate.”

Joglar pointed out that that recommendation, although not characterized in the same way, was in fact included in the 2023 US guidelines.

“We wanted to use a more nuanced approach,” he said, and move away from using CHA2DS2–VASc as the prime determinant of whether to start oral anticoagulation and towards a magnitude risk assessment, in which female sex is seen as a risk modifier.

“The Europeans and the Americans are looking at the same data, so we often reach the same conclusions,” Joglar said, although “we sometimes use different wordings.”

Overall, Kotecha expressed the hope that the move “will lead to better implementation of guidelines, at the end of the day.”

“That’s all we can hope for: Patients will be offered a more individualized approach, leading to more appropriate use of treatment in the right patients.”

The newer direct oral anticoagulation is “a much simpler therapy,” he added. “There is very little monitoring, a similar risk of bleeding as aspirin, and yet the ability to largely prevent the high rate of stroke and thromboembolism associated with atrial fibrillation.”

“So, it’s a big ticket item for our communities and public health, particularly as atrial fibrillation is expected to double in prevalence in the next few decades and evidence is building that it can lead to vascular dementia in the long-term.”

No funding was declared. Kotecha declares relationships with Bayer, Protherics Medicines Development, Boston Scientific, Daiichi Sankyo, Boehringer Ingelheim, BMS-Pfizer Alliance, Amomed, MyoKardia. Curtis declared relationships with Janssen Pharmaceuticals, Medtronic, Abbott. Joglar declared no relevant relationships. Tedrow declared no relevant relationships.

A version of this article appeared on Medscape.com.

The European Society of Cardiology (ESC) caused a stir when they recommended in their latest atrial fibrillation (AF) management guideline that gender no longer be included in the decision to initiate oral anticoagulation therapy.

The move aims to level the playing field between men and women and follows a more nuanced understanding of stroke risk in patients with AF, said experts. It also acknowledges the lack of evidence in people receiving cross-sex hormone therapy.

In any case, the guidelines, developed in collaboration with the European Association for Cardio-Thoracic Surgery and published by the European Heart Journal on August 30, simply follow 2023’s US recommendations, they added.

 

One Size Does Not Fit All

So, what to the ESC guidelines actually say?

They underline that, if left untreated, the risk for ischemic stroke is increased fivefold in patients with AF, and the “default approach should therefore be to provide oral anticoagulation to all eligible AF patients, except those at low risk for incident stroke or thromboembolism.”

However, the authors note that there is a lack of strong evidence on how to apply the current risk scores to help inform that decision in real-world patients.

Dipak Kotecha, MBChB, PhD, Professor of Cardiology at the University of Birmingham and University Hospitals Birmingham NHS Foundation Trust, Birmingham, England, and senior author of the ESC guidelines, said in an interview that “the available scores have a relatively poor ability to accurately predict which patients will have a stroke or thromboembolic event.”

Instead, he said “a much better approach is for healthcare professionals to look at each patient’s individual risk factors, using the risk scores to identify those patients that might not benefit from oral anticoagulant therapy.”

For these guidelines, the authors therefore wanted to “move away from a one-size-fits-all” approach, Kotecha said, and instead ensure that more patients can benefit from the new range of direct oral anticoagulants (DOACs) that are easier to take and with much lower chance of side effects or major bleeding.

To achieve this, they separated their clinical recommendations from any particular risk score, and instead focused on the practicalities of implementation.

 

Risk Modifier Vs Risk Factor

To explain their decision the authors highlight that “the most popular risk score” is the CHA2DS2–VASc, which gives a point for female sex, alongside factors such as congestive heart failure, hypertension, and diabetes mellitus, and a sliding scale of points for increasing age.

Kotecha pointed out the score was developed before the DOACs were available and may not account for how risk factors have changed in recent decades.

The result is that CHA2DS2–VASc gives the same number of points to an individual with heart failure or prior transient ischemic attack as to a woman aged less than 65 years, “but the magnitude of increased risk is not the same,” Usha Beth Tedrow, MD, Associate Professor of Medicine, Brigham and Women’s Hospital, Boston, Massachusetts, said in an interview.

As far back as 2018, it was known that “female sex is a risk modifier, rather than a risk factor for stroke in atrial fibrillation,” noted Jose Joglar, MD, lead author of the 2023 ACC/AHA/ACCP/HRS Guideline for the Diagnosis and Management of Atrial Fibrillation said in an interview.

Danish national registry study involving 239,671 AF patients treated between 1997 and 2015, nearly half of whom were women, showed that, at a CHA2DS2–VASc score of 0, the “risk of stroke between men and women is absolutely the same,” he said.

“It is not until after a CHA2DS2–VASc score of 2 that the curves start to separate,” Joglar, Program Director, Clinical Cardiac Electrophysiology Fellowship Program, The University of Texas Southwestern Medical Center, Dallas, continued, “but by then you have already made the decision to anticoagulate.”

More recently, Kotecha and colleagues conducted a population cohort study of the electronic healthcare records of UK primary care patients treated between 2005 and 2020, and identified 78,852 with AF; more than a third were women.

Their analysis, published on September 1, showed that women had a lower adjusted rate of the primary composite outcome of all-cause mortality, ischemic stroke, or arterial thromboembolism, driven by a reduced mortality rate.

“Removal of gender from clinical risk scoring could simplify the approach to which patients with AF should be offered oral anticoagulation,” Kotecha and colleagues concluded.

Joglar clarified that “women are at increased risk for stroke than men” overall, but by the time that risk “becomes manifest, other risk factors have come into play, and they have already met the criteria for anticoagulation.”

The authors of the latest ESC guideline therefore concluded that the “inclusion of gender complicates clinical practice both for healthcare professionals and patients.” Their solution was to remove the question of gender for decisions over initiating oral anticoagulant therapy in clinical practice altogether.

This approach extends to individuals who identify as transgender or are undergoing sex hormone therapy, as all the experts interviewed by Medscape Medical News agreed that there is currently insufficient evidence to know whether either affects stroke risk.

Instead, the guidelines state that the drugs are “recommended in those with a CHA2DS2-VA score of 2 or more and should be considered in those with a CHA2DS2-VA score of 1, following a patient-centered and shared care approach.”
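
To make the arithmetic behind that recommendation concrete, the following minimal Python sketch applies the standard CHA2DS2-VASc weights (1 point each for congestive heart failure, hypertension, diabetes, vascular disease, and age 65-74; 2 points each for age 75 or older and prior stroke, transient ischemic attack, or thromboembolism) with the sex-category point removed, as in the CHA2DS2-VA score, and maps the result to the thresholds quoted above. The function names, parameters, and output wording are illustrative assumptions, not part of the guideline.

def cha2ds2_va(age, heart_failure, hypertension, diabetes,
               prior_stroke_tia_te, vascular_disease):
    """Return the CHA2DS2-VA score (0-8): CHA2DS2-VASc without the sex-category point."""
    score = 2 if age >= 75 else (1 if age >= 65 else 0)  # A2 / A
    score += 1 if heart_failure else 0                    # C
    score += 1 if hypertension else 0                     # H
    score += 1 if diabetes else 0                         # D
    score += 2 if prior_stroke_tia_te else 0              # S2
    score += 1 if vascular_disease else 0                 # V
    return score

def anticoagulation_guidance(score):
    """Map a CHA2DS2-VA score to the guideline wording quoted above (illustrative only)."""
    if score >= 2:
        return "oral anticoagulation recommended"
    if score == 1:
        return "oral anticoagulation should be considered (shared decision)"
    return "no routine oral anticoagulation"

# Example: a 68-year-old with hypertension scores 2 regardless of sex,
# so anticoagulation would be recommended under the quoted thresholds.
print(anticoagulation_guidance(cha2ds2_va(68, False, True, False, False, False)))

The only change from a CHA2DS2-VASc implementation would be the absence of a point for female sex, which is exactly the simplification the guideline authors describe.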

“Dropping the gender part of the risk score is not really a substantial change” from previous ESC or other guidelines, as different points were required in the past to recommend anticoagulants for women and men, Kotecha said, adding that “making the approach easier for clinicians may avoid penalizing women as well as nonbinary and transgender patients.”

Anne B. Curtis, MD, SUNY Distinguished Professor, Department of Medicine, Jacobs School of Medicine & Biomedical Sciences, University at Buffalo in New York, agreed.

Putting aside the question of female sex, she said that there are not a lot of people under the age of 65 years with “absolutely no risk factors,” and so, “if the only reason you would anticoagulate” someone of that age is because they are a woman, that “doesn’t make a lot of sense to me.”

The ESC guidelines are “trying to say, ‘look at the other risk factors, and if anything is there, go ahead and anticoagulate,’” Curtis said in an interview.

“It’s actually a very thoughtful decision,” Tedrow said, and not “intended to discount risk in women.” Rather, it’s a statement that acknowledges the problem of recommending anticoagulation therapy in women “for whom it is not appropriate.”

Joglar pointed out that this recommendation, although not characterized in the same way, was in fact included in the 2023 US guidelines.

“We wanted to use a more nuanced approach,” he said, and move away from using CHA2DS2–VASc as the prime determinant of whether to start oral anticoagulation and towards a magnitude risk assessment, in which female sex is seen as a risk modifier.

“The Europeans and the Americans are looking at the same data, so we often reach the same conclusions,” Joglar said, although “we sometimes use different wordings.”

Overall, Kotecha expressed the hope that the move “will lead to better implementation of guidelines, at the end of the day.”

“That’s all we can hope for: Patients will be offered a more individualized approach, leading to more appropriate use of treatment in the right patients.”

The newer direct oral anticoagulants provide “a much simpler therapy,” he added. “There is very little monitoring, a similar risk of bleeding as aspirin, and yet the ability to largely prevent the high rate of stroke and thromboembolism associated with atrial fibrillation.”

“So, it’s a big ticket item for our communities and public health, particularly as atrial fibrillation is expected to double in prevalence in the next few decades and evidence is building that it can lead to vascular dementia in the long-term.”

No funding was declared. Kotecha declared relationships with Bayer, Protherics Medicines Development, Boston Scientific, Daiichi Sankyo, Boehringer Ingelheim, the BMS-Pfizer Alliance, Amomed, and MyoKardia. Curtis declared relationships with Janssen Pharmaceuticals, Medtronic, and Abbott. Joglar and Tedrow declared no relevant relationships.

A version of this article appeared on Medscape.com.


Barzolvolimab Effective for CSU in Phase 2 Study

Article Type
Changed
Wed, 11/27/2024 - 04:35

Patients with chronic spontaneous urticaria (CSU) experienced early and sustained improvements in symptom scores on treatment with barzolvolimab during the 52-week follow-up of an ongoing phase 2 study.

Moreover, in the study, barzolvolimab, an anti-KIT monoclonal antibody that inhibits the activation of and depletes mast cells, induced comparable responses in a subset of patients who had taken omalizumab, an anti–immunoglobulin E monoclonal antibody approved by the Food and Drug Administration for treating CSU.

The findings were presented at the annual European Academy of Dermatology and Venereology (EADV) 2024 Congress. Barzolvolimab is being developed by Celldex Therapeutics.

“Barzolvolimab treatment resulted in rapid, profound, and durable improvement in UAS7 [weekly Urticaria Activity Score 7],” said presenter Martin Metz, MD, professor of dermatology, Institute of Allergology, Charité – Universitätsmedizin Berlin in Germany, “with a deepening of response over 52 weeks in patients with antihistamine-refractory CSU.”

“Similar robust improvement was seen in patients previously treated with omalizumab, including refractory patients,” he added.

Because barzolvolimab was well tolerated over the course of the follow-up period, Metz said, it “has the potential to be an important new treatment option,” noting that patients are now being enrolled in global phase 3 studies of barzolvolimab.
 

Sustained Symptom Relief

Ana M. Giménez-Arnau, MD, PhD, associate professor of dermatology, Autonomous University and Pompeu Fabra University, Barcelona, Spain, told Medscape Medical News that the results are important, as they showed people who switched from placebo to the active drug also saw a long-term benefit.

What is “remarkable” about barzolvolimab, continued Giménez-Arnau, who was not involved in the study, is that it is the first drug to target the KIT receptor on mast cells and interfere with stimulating growth factors, thus making the cells that drive the development of CSU “disappear.”

The study included three different barzolvolimab regimens, with the 150-mg dose every 4 weeks and the 300-mg dose every 8 weeks achieving similar results, noted Giménez-Arnau.

For her, important questions remain around the pharmacokinetic and pharmacodynamic profiles of the two regimens, but she underlined that, for the patient, the choice of regimen could have an impact on quality of life.

“If we give 300 mg every 8 weeks,” she said, it appears “you can achieve disease control” while halving the frequency of subcutaneous injections.

She said that it would be “interesting to know” if 300 mg every 8 weeks is given as two 150-mg injections every 2 months or one 300-mg injection. If it is the former, Giménez-Arnau said, “This is potentially an important benefit for the patient.”
 

Sustained Benefits at 1 Year

The study enrolled 208 patients with antihistamine-refractory CSU at sites in 10 countries, randomizing them to one of four arms: subcutaneous injections of barzolvolimab 75 mg or 150 mg every 4 weeks, 300 mg every 8 weeks, or placebo every 4 weeks.

The mean age in each arm was between 42 and 47 years, and around 75% were women. Across the arms, 64%-76% had severe disease, as measured on the UAS7, at a mean score of 30.0-31.3. Around 20% had previously been treated with omalizumab.

Patients were treated for 16 weeks, during which time they completed daily and weekly diaries and attended six clinic visits at weeks 0, 2, 4, 8, 12, and 16. Results from the trial, published earlier this year, demonstrated that both regimens (150 mg every 4 weeks and 300 mg every 8 weeks) achieved clinically meaningful and statistically significant improvements in UAS7, the primary endpoint, vs placebo at 12 weeks.

Participants in the barzolvolimab 75 mg and placebo arms were then randomized to receive barzolvolimab 150 mg every 4 weeks or 300 mg every 8 weeks, and those who had been in the 150-mg and 300-mg treatment arms continued with that treatment for a further 36 weeks. (The remaining patients are continuing in a further 24-week follow-up, but those data are not yet available.)

By the 52-week follow-up, 25% of patients who started in each of the barzolvolimab arms had discontinued treatment, as had 16% of those first randomized to the placebo arm.

Metz reported that the improvements in UAS7 scores, observed as early as week 1, were sustained through week 52 in patients in both the ongoing 150-mg and 300-mg arms. Patients who initially started in the placebo and the barzolvolimab 75-mg groups caught up with those who had started on the higher doses, so that by week 52, there were no significant differences in urticaria activity, hives, or itch scores between the arms.

By week 52, the proportion of patients achieving well-controlled disease, defined as a UAS7 score ≤ 6, was 73.7% in the barzolvolimab 150-mg every-4-weeks arm and 68.2% in the barzolvolimab 300-mg every-8-weeks arm.

Notably, just 12.8% of patients in the placebo arm had achieved well-controlled CSU by week 16, but after switching to barzolvolimab 150 mg every 4 weeks or 300 mg every 8 weeks, 63% reached that target at week 52.

“Maybe even more striking and very interesting to look at,” said Metz, was the complete control of symptoms, meaning “not one single wheal and no itch.” By week 52, 52% of those on 300 mg every 8 weeks and 71.1% of those on 150 mg every 4 weeks had a complete response, with no itch/hives (UAS7 of 0).

Importantly, complete responses with barzolvolimab were observed early and were sustained or improved to week 52, Metz said, with, again, placebo and former barzolvolimab 75 mg patients catching up with those who started on 150 mg every 4 weeks and 300 mg every 8 weeks once they switched at week 16.

“This is the best data for chronic spontaneous urticaria that we have so far seen,” he said, adding that the responses were seen regardless of prior experience with omalizumab.

Changes in Hair Color, Skin Pigmentation

As for safety, during the first 16 weeks, 66% of those on active treatment and 39% of those on placebo experienced at least one adverse event. There were no treatment-related serious adverse events during this period, compared with two among those who received treatment for the full 52 weeks.

The most common adverse events with active treatment were hair color changes (14% in the first 16 weeks and 26% among those treated for the full 52 weeks), neutropenia/reduced neutrophil count (9% in the first 16 weeks and 17% among those treated for the full 52 weeks), and skin hypopigmentation (1% in the first 16 weeks, 13% among those treated for the full 52 weeks, and 19% among those who switched from placebo to 36 weeks of active treatment). Urticaria was reported by 10% of patients on active treatment and 10% of those on placebo in the first 16 weeks, and by 15% of those treated for the full 52 weeks.

In the post-presentation discussion, Metz explained that the hypopigmentation appears to start around the hair follicle and is diffuse, so tends to look like vitiligo.

He suggested that the melanocytes around the hair follicle “seem to be the ones that are more stressed, maybe because of the hair follicle cycling,” adding that the effect is reversible and does not appear to be dose dependent.

The study was funded by Celldex Therapeutics. Metz declared relationships with AbbVie, ALK-Abelló, Almirall, Amgen, argenx, AstraZeneca, Astria, Attovia Therapeutics, Celldex, Celltrion, Escient Pharmaceuticals, Galen, Galderma, GSK, Incyte, Jasper, Lilly, Novartis, Pfizer, Pharvaris, Regeneron, Sanofi, Teva, Third Harmonic Bio, and Vifor.

A version of this article first appeared on Medscape.com.


Children With Severe Atopic Dermatitis Catch Up on Growth With Dupilumab

Article Type
Changed
Wed, 11/27/2024 - 04:35

Children with short stature related to severe atopic dermatitis not only can have their condition effectively treated with 16 weeks of dupilumab but also may experience improved growth, bringing them back toward standard height curves, revealed a post hoc trial analysis.

The research was presented at the European Academy of Dermatology and Venereology (EADV) 2024 Congress.

The trial included a “rigorously selected … well-characterized, well-studied” population of children aged 6-11 years, said presenter Alan D. Irvine, MD, DSc, professor of dermatology, Trinity College Dublin, Ireland.

It showed that “severe atopic dermatitis does cause restriction of growth, as well as a higher weight, and therefore obviously a higher BMI [body mass index].”

He continued, however, that children at the lower percentiles of height receiving prompt treatment with dupilumab (Dupixent) “were able to rapidly move through the centiles over the 16 weeks of the study, and that may be the window for catch-up growth … when children are growing rapidly.”

Anna Yasmine Kirkorian, MD, chief of dermatology, Children’s National Hospital, Washington, DC, who was not involved in the study, said that she was “surprised” at the degree of growth achieved over the study period, as height is not something that jumps up “overnight.”

“On the other hand, it fits with my experience with children who’ve had the brakes on all of their life due to inflammation, whether it be height, going to school, sleeping — everything is sort of put on pause by this terrible inflammatory process,” she said.

“When you take the brakes off, they get to be who they are going to be,” Kirkorian added. “So I was surprised by the speed of it, but not by the fact that height was acquired.”

Her belief is that in the pre-dupilumab era, severe atopic dermatitis was often insufficiently controlled, so children were “smaller than you would predict from parental height,” and the treatment is “allowing them to reach their genetic potential.”
 

Post Hoc Analysis 

In his presentation, Irvine emphasized that it has been clearly demonstrated that adolescents with moderate and severe atopic dermatitis have a significantly higher likelihood of being below the 25th percentile of height on growth reference charts.

Such children are also at a higher risk of having low bone mineral density and low serum alkaline phosphatase (ALP) levels. While data presented at the EADV 2023 Congress showed that dupilumab significantly increased serum levels of bone ALP compared with placebo, the underlying mechanism remains unclear.

For the current analysis, Irvine and colleagues determined the proportion of children aged 6-11 years with severe atopic dermatitis and lower stature who reached a ≥ 5 centile improvement in height following 16 weeks of dupilumab treatment.

They examined data from the LIBERTY AD PEDS trial, in which patients aged 6-11 years with severe atopic dermatitis were randomized to 300 mg dupilumab every 4 weeks or placebo along with a mild or moderately potent topical corticosteroid. The study found that, overall, dupilumab was associated with significant improvements in signs, symptoms, and quality of life compared with placebo.

Height measures at baseline revealed that “more boys and more girls were below the 50th centile than you would predict for a healthy, normal control population,” Irvine said. “If we look at weight, we see the opposite,” he continued, “with a disproportionate number of boys and girls who are above the 50th centile for weight at baseline.”

Consequently, “we’re seeing these children who are shorter and heavier than the predicted healthy weight range and, as a result, obviously have higher BMI,” Irvine noted, with 67% of girls and 62% of boys found to have a higher BMI than normal for their age.

After 16 weeks of treatment with dupilumab, there was a much greater gain in height than that seen among those on placebo, with the most pronounced effect seen in children who had the lowest height at baseline. Indeed, among children below the 25th height percentile at baseline, 30.6% on dupilumab vs 11.9% on placebo experienced an increase in height of 5 centiles or more (P < .05).

“This reflects what we see in clinical practice,” Irvine said. “Children often grow dramatically on treatment for atopic dermatitis.”

Among patients with a baseline height below the 30th percentile, 31.9% treated with dupilumab vs 11.1% treated with placebo gained at least 5 centiles in height. The figures for children below the 40th height percentile at baseline were 31.3% vs 15.5% (P < .05 for both).

Although there remained a marked difference in the proportion of children below the 50th height percentile at baseline who gained 5 centiles or more in height, at 29.0% with dupilumab vs 15.7% with placebo, the difference was no longer statistically significant.

“So the effect of catch-up growth, or growth through the centiles, is most marked in those who are in the 40th centile or below,” Irvine said, indicating that the “more growth restricted kids have much more potential to catch up.”

‘Convincing’ Data

Overall, Kirkorian said in the interview, the data are “convincing” and support her view that severe atopic dermatitis is a “terrible chronic disease that we really underappreciate.” Atopic dermatitis, she added, “should get the respect that any severe chronic illness would have, whether that be arthritis, diabetes, or cardiac disease, because it is a systemic disorder that … profoundly affects quality of life, every minute of every day.”

However, “we don’t get all the referrals we should, until the child has suffered for years and years, and the family has suffered,” as there is a bias that it can be outgrown, although not every child does outgrow it, and it “doesn’t look as conspicuous as other chronic skin disorders,” such as psoriasis.

“Now with this study,” Kirkorian said, “it gives us a really compelling point to make to parents, to the community, and to insurers that not only are we affecting the quality of life from the itch standpoint [with dupilumab] but we may have long profound effects on growth and bone health.”

The research was sponsored by Sanofi and Regeneron Pharmaceuticals. Irvine declared relationships with AbbVie, Arena Pharmaceuticals, BenevolentAI, Chugai Pharmaceutical, Dermavant, Eli Lilly, Genentech, LEO Pharma, Menlo Therapeutics, Novartis, Pfizer, Regeneron, Sanofi, UCB, DS Biopharma, and Inflazome. Kirkorian declared relationships with Dermavant, Verrica Pharmaceuticals, Pfizer, and Incyte.
 

A version of this article first appeared on Medscape.com.
