New Evidence Red Meat–Rich Diet Can Exacerbate IBD
Researchers from China observed that mice fed a red meat diet experienced more severe intestinal inflammation after colitis was experimentally induced compared to those on a control diet.
“These results highlight the necessity of dietary optimization, particularly the reduction of red meat consumption, as a preventive strategy against the development of IBD,” wrote Dan Tian, MD, PhD, with Capital Medical University, Beijing, China, and colleagues. The study was published online in Molecular Nutrition & Food Research.
Environmental Trigger
The exact causes of IBD remain unclear, but diet has long been considered a key environmental trigger. Western dietary patterns, which often feature high consumption of red and processed meats and low fiber, have been associated with higher IBD rates, especially ulcerative colitis.
Tian and colleagues tested the aggravating effects of three red meat diets on intestinal inflammation, gut microbiota composition, and susceptibility to colitis in mice.
They fed mice red meat diets prepared from pork, beef, and mutton for 2 weeks before inducing colitis using dextran sulfate sodium. They monitored the animals for changes in weight, colon length, tissue damage, and immune activity.
Histological analysis revealed that all three red meat diets aggravated colonic inflammation, with mutton producing the most pronounced effects.
RNA sequencing of colon tissue further showed that red meat intake activated pathways linked to inflammation. “Notably,” the researchers reported, expression of proinflammatory cytokines, including interleukin (IL)-1 beta and IL-6, was significantly upregulated, and expression of genes related to myeloid cell chemotaxis and activation was also increased.
Flow cytometry confirmed that red meat diets promoted a surge in colonic myeloid immune cells, potentially driving inflammation. However, only minimal changes in T lymphocytes were observed, indicating that red meat primarily drives innate rather than adaptive immune activation, the researchers suggested.
While overall microbial diversity was not significantly altered, red meat-fed mice displayed marked dysbiosis.
Beneficial bacteria such as Streptococcus, Akkermansia, Faecalibacterium, and Lactococcus declined, while harmful groups including Clostridium and Mucispirillum increased. Each type of meat had distinct microbial effects, but all skewed the balance toward potentially harmful bacteria known to promote gut inflammation.
Overall, these results suggest that red meat diets exacerbate colitis by simultaneously promoting immune cell infiltration and disturbing microbial communities in the gut.
The fact that these effects occurred without significant changes in weight suggests that red meat consumption exerts proinflammatory effects through mechanisms other than weight gain.
“These results offer valuable insights into the relationship between dietary interventions and IBD, suggesting that a balanced diet, adequate nutrients, and moderated red meat consumption may help prevent the development of IBD,” the researchers concluded.
In support of their findings, a 2024 umbrella review that synthesized data from multiple cohort and observational studies found strong associations between Western-style dietary patterns — including high intake of processed and red meat, saturated fats, and additives — and both the incidence and progression of IBD.
The study had no commercial funding. The authors declared having no conflicts of interest.
A version of this article appeared on Medscape.com.
How IBS Disrupts Daily Life: AGA Survey
A new survey from AGA, in partnership with The Harris Poll, revealed that IBS symptoms interfere with people’s lives an average of 19 days each month — about 11 days affecting work or school and 8 days curtailing personal activities.
Missed work or school has climbed to 3.6 days per month from 2.1 days in 2015 — the last time the AGA released the “IBS in America” survey. And more patients report spending less time with family and friends because of their symptoms (58% now, up from 48% in 2015).
The latest survey was conducted in fall 2024 among more than 2000 patients with IBS and 600 healthcare providers, including gastroenterologists, primary care physicians, and advanced practitioners.
Stark Realities of Life With IBS
Fewer patients in 2024 described their IBS symptoms as very or extremely bothersome (43%, compared to 62% in 2015), yet three quarters said it’s tough to manage their symptoms and most can’t accurately predict whether they will experience symptoms on a given day.
All this affects patients’ willingness or ability to make plans. More than three quarters (77%) said they avoid situations where bathroom access is limited, and nearly that many (72%) said their symptoms cause them to stay home more often.
About 7 in 10 patients said their IBS symptoms make them feel like they’re not “normal” or that their symptoms prevent them from reaching their full potential.
“The findings of this survey underscore the persistent challenges and impact IBS has on patients’ lives,” said Andrea Shin, MD, gastroenterologist with UCLA Health, Los Angeles, and AGA patient education advisor.
“Despite progress in the medical community’s approach to diagnosing and managing IBS, patients continue to suffer significant disruptions to their personal and professional lives,” Shin noted.
How Is IBS Treated?
Treatment options for IBS have evolved over the last decade or so and now include several FDA-approved agents, such as plecanatide (Trulance) and tenapanor (Ibsrela) for IBS with constipation (IBS-C) and rifaximin (Xifaxan) and eluxadoline (Viberzi) for IBS with diarrhea (IBS-D).
According to patients who have tried them, prescription medications are among the most helpful treatments (18% for IBS-C and 19% for IBS-D).
Yet, clinicians tend to prioritize fiber, nonprescription laxatives, and exercise for IBS-C, and diet changes, antidiarrheals, and probiotics for IBS-D, over prescription medications, the survey found.
Nonetheless, about 78% of patients reported being satisfied with what they take for their symptoms, with about one quarter very satisfied.
Compared to 10 years ago, more physicians in the latest survey cited effective relief of abdominal pain (49% vs 39%) or diarrhea/constipation (47% vs 33%) and the availability of treatment options (49% vs 34%) as what is most lacking in IBS treatment today, despite advancements in the IBS treatment landscape.
“IBS is a condition that continues to challenge patients to find a treatment that consistently works for them,” said Jeffrey Roberts, founder of the IBS Patient Support Group community and World IBS Day.
“The AGA IBS in America Survey sheds light on patients who are still not being offered a variety of treatments that could provide them with a better quality of life. This continues to result in disruptions to their career, schooling, and life with their families and friends,” Roberts added.
A version of this article appeared on Medscape.com.
Skip Antibiotic Prophylaxis for Upper GI Bleeding in Cirrhosis?
Pooled data from 14 randomized controlled trials (RCTs) showed a high probability that no or shorter durations of antibiotic prophylaxis are not worse than longer durations in preventing death from any cause in patients with cirrhosis and upper GI bleeding.
The findings suggest that recommendations for routine antibiotic prophylaxis in patients with cirrhosis and upper GI bleeding “should be reassessed,” the authors said.
They acknowledged, however, that the studies were of low-to-moderate quality and that higher-quality randomized clinical trial data are needed.
The study, with first author Connor Prosty, MD, of McGill University, in Montreal, Quebec, Canada, was published online in JAMA Internal Medicine.
Questionable Benefit?
Antibiotic prophylaxis became standard decades ago, when up to 60% of variceal bleeds were complicated by infections, which were thought to increase the risk for rebleeding and death.
Yet, the evidence to support the recommendation remains limited, and a recent RCT called into question the necessity of prophylaxis. That study showed no statistically significant difference in mortality or infection between patients with Child-Pugh class A cirrhosis randomized to no prophylaxis and those randomized to a third-generation cephalosporin.
While generally perceived as safe, antibiotics have potential adverse effects and can select for resistant superinfections, Prosty and colleagues pointed out.
They also noted that shorter courses of antibiotics have been proven to be as good as, if not better than, longer courses across numerous other infectious indications. Recommendations for primary and secondary antibiotic prophylaxis for spontaneous bacterial peritonitis are being reassessed due to a weak evidence base, lack of mortality benefit, and potential for harm.
To revisit antibiotic prophylaxis for upper GI bleeding in patients with cirrhosis, Prosty and colleagues did a systematic review and meta-analysis of 14 RCTs involving 1322 patients.
Two of the trials compared longer (5-7 days) with shorter (2-3 days) antibiotics, and 12 compared any antibiotic prophylaxis (1-10 days) to none.
The primary outcome was all-cause mortality, with a prespecified noninferiority margin of 5% on the risk difference (RD) scale. Secondary outcomes included early rebleeding and bacterial infections.
Overall, shorter antibiotic durations (including none) had a 97.3% probability of noninferiority to longer durations for all-cause mortality (RD, 0.9%; 95% credible interval [CrI], -2.6% to 4.9%).
Shorter durations had a 73.8% probability of noninferiority for early rebleeding (RD, 2.9%; 95% CrI, -4.2% to 10.0%) but were associated with more study-defined bacterial infections (RD, 15.2%; 95% CrI, 5.0%-25.9%). However, the authors cited methodological concerns about the definitions of these infections in the included studies.
The probabilities of noninferiority of shorter durations for mortality, early rebleeding, and bacterial infections were higher in studies published after 2004.
Change Practice Now?
“Our findings re-open the discussion surrounding the long-standing and firmly held belief that antibiotic prophylaxis has a mortality benefit in patients with cirrhosis presenting with upper gastrointestinal bleeds,” Prosty and colleagues wrote.
They cautioned, however, that the study quality was “low to moderate, bacterial infections were heterogeneously defined, and no studies reported adverse events. Higher-quality RCTs are needed to determine the benefit and optimal duration of antibiotic prophylaxis in the modern era of advanced interventions.”
The authors of a commentary published with the study noted that management of upper GI bleeding in patients with cirrhosis has “greatly improved” since the 1990s, when some of the trials included in the analysis were conducted.
Hepatologists Catherine Mezzacappa, MD, MPH, and Guadalupe Garcia-Tsao, MD, both at the Yale School of Medicine, New Haven, Connecticut, agree that it “may be time to revisit whether prophylactic antibiotics continue to provide benefit in patients with cirrhosis and upper GI bleeding, and if so, in which patients.”
They caution, however, that the current level of evidence is “inadequate to answer whether it is time to stop this practice, which has become the standard of care.”
New trials for shorter duration and no antibiotic prophylaxis “should be designed in specific patient populations to compare sequelae of antibiotic prophylaxis, including subsequent infections and all-cause mortality,” Mezzacappa and Garcia-Tsao concluded.
The study received no specific funding. The authors and commentary writers had no relevant disclosures.
A version of this article appeared on Medscape.com.
Diet Rich in Ultraprocessed Grains Increases Risk for IBD
A diet rich in ultraprocessed grains is associated with an increased risk of developing inflammatory bowel disease (IBD), a large study has found.
The sweeping analysis of 124,590 adults from 21 countries found that those eating at least 19 g of ultraprocessed grains a day were up to about twice as likely to be diagnosed with IBD as peers eating less than 19 g daily.
“Our study adds robust evidence from a large, diverse global cohort that frequent consumption of ultraprocessed grains is associated with an increased risk of developing inflammatory bowel disease,” Neeraj Narula, MD, MPH, gastroenterologist and associate professor of medicine, McMaster University, Hamilton, Ontario, Canada, told GI & Hepatology News.
The study also “further clarifies that not all grains carry risk — minimally processed grains like fresh bread and rice were even associated with lower risk. These results build on and specify previous findings linking ultraprocessed foods more broadly to IBD,” Narula said.
The study was published in The American Journal of Gastroenterology.
Diet Matters to IBD Risk
According to the latest US data (2021-2023), ultraprocessed foods made up 62% of daily calories for young people and 53% for adults.
The Prospective Urban Rural Epidemiology (PURE) study has followed participants aged 35-70 years for a median of nearly 13 years. At enrollment, volunteers completed country-specific food-frequency questionnaires, enabling researchers to quantify usual intake of more than 130 food items and track new cases of IBD reported at biennial follow-ups.
The researchers classified packaged breads, sweet breakfast cereals, crackers, pastries and ready-to-heat pizza or pasta as ultraprocessed grains because they are refined and typically contain additives such as emulsifiers and preservatives. Fresh bakery bread and plain rice were analyzed separately as minimally processed grain references.
During a median of 12.9 years, 605 participants developed IBD; 497 developed ulcerative colitis (UC) and 108 developed Crohn’s disease.
Increased intake of ultraprocessed grains was associated with a higher risk for IBD, with hazard ratios (HR) of 2.08 for intake of ≥ 50 g/d and 1.37 for 19-50 g/d compared to intake of < 19 g/d. The increased risk was largely driven by a significantly increased risk for UC (HR, 2.46) and not Crohn’s disease (HR, 0.98).
Among the different ultraprocessed grain products, packaged bread stood out: Consuming ≥ 30 g/d of packaged bread (a little more than one slice) was associated with a greater than twofold increased risk for IBD (HR, 2.11) compared to no intake of packaged bread.
In contrast, greater consumption of fresh bread was associated with a reduced risk of developing IBD (HR, 0.61 for ≥ 65 g/d and 0.45 for 16-65 g/d compared to < 16 g/d).
Increased intake of rice was also associated with a lower risk of developing IBD (HR, 0.63 for ≥ 1 serving/d and 0.99 for < 1 serving/d).
When the researchers widened the lens to all ultraprocessed foods — from sodas to salty snacks — the risk for IBD climbed further.
Participants eating at least five servings a day had a nearly fourfold higher risk for IBD than those eating fewer than one serving (HR, 3.95) — a finding consistent with other data from the PURE study cohort.
What to Tell Patients?
The authors acknowledged in their paper that it’s difficult — if not impossible — to completely avoid ultraprocessed food in the Western diet.
They said their findings support “public health strategies to promote consumption of whole and minimally processed foods while reducing the consumption of highly processed alternatives.”
“I tell my patients that emerging literature shows an association between ultraprocessed food intake and IBD risk, but it’s not yet clear whether simply cutting out those foods will improve disease activity once IBD is established,” Narula told GI & Hepatology News.
“However, I still encourage patients to reduce ultraprocessed foods and to follow a Mediterranean-style diet — focusing on minimally processed grains, fruits, vegetables, healthy fats, and lean proteins — to support overall gut and general health,” Narula said.
Reached for comment, Ashwin Ananthakrishnan, MD, MPH, AGAF, associate professor of medicine, Massachusetts General Hospital, Boston, who wasn’t part of the study, said it “adds incrementally to the growing data on how ultraprocessed foods may affect the risk of IBD.”
“They (and others) have previously shown a link between general ultraprocessed food consumption and risk of IBD. Others have shown that some of this is mediated through refined grains. This study more specifically studies that question and demonstrates an association,” said Ananthakrishnan.
“This should not be used, however, to counsel patients. It does not study the impact of grain intake on patients with IBD. It may help inform population level preventive strategies (or in high-risk individuals) but requires more confirmation since there is significant heterogeneity between the various countries in this cohort. Countries that have high refined grain intake are also enriched in several other IBD risk factors (including genetics),” Ananthakrishnan told GI & Hepatology News.
The PURE study is an investigator-initiated study funded by the Population Health Research Institute, Hamilton Health Sciences Research Institute, Canadian Institutes of Health Research, and Heart and Stroke Foundation of Ontario. It received support from Canadian Institutes of Health Research’s Strategy for Patient Oriented Research, Ontario SPOR Support Unit, and Ontario Ministry of Health and Long-Term Care and unrestricted grants from several pharmaceutical companies. Narula declared receiving honoraria from Janssen, AbbVie, Takeda, Pfizer, Sandoz, Novartis, Iterative Health, Innomar Strategies, Fresenius Kabi, Amgen, Organon, Eli Lilly, and Ferring. Ananthakrishnan declared having no relevant disclosures.
A version of this article appeared on Medscape.com.
Alarming Rise in Early-Onset GI Cancers Calls for Early Screening, Lifestyle Change
The alarming rise in early-onset gastrointestinal (GI) cancers calls for earlier screening and lifestyle change, said the authors of a JAMA review.
In the US, early-onset GI cancers are increasing faster than any other type of early-onset cancer, including breast cancer. The trend is not limited to colorectal cancer (CRC): Gastric, pancreatic, and esophageal cancers, as well as many biliary tract and appendix cancers, are also on the rise in young adults, Kimmie Ng, MD, MPH, and Thejus Jayakrishnan, MD, both with Dana-Farber Cancer Institute, Boston, noted in their article.
The increase in early-onset GI cancers follows a “birth cohort effect,” with generational variation in risk, suggesting a potential association with changes in environmental exposures, Ng explained in an accompanying JAMA podcast.
All of these GI cancers are strongly linked to multiple modifiable risk factors, and it is a “top area of investigation to determine exactly what environmental exposures are at play,” Ng added.
For many of these GI cancers, obesity has been the “leading hypothesis,” given that rising obesity rates seem to parallel the increase in incidence of early-onset GI cancers, Ng explained.
“But we also have evidence, particularly strong for colorectal cancer, that dietary patterns, such as consuming a Western diet, as well as sedentary behavior and lifestyles seem to be associated with a significantly higher risk of developing these cancers at an age under 50,” Ng said.
Rising Incidence
Globally, among early-onset GI cancers reported in 2022, CRC was the most common (54%), followed by gastric cancer (24%), esophageal cancer (13%), and pancreatic cancer (9%).
In the US in 2022, 20,805 individuals were diagnosed with early-onset CRC, 2689 with early-onset gastric cancer, 2657 with early-onset pancreatic cancer, and 875 with early-onset esophageal cancer.
Since the mid-1990s, CRC incidence among adults of all ages in the US has declined by 1.3%-4.2% annually, but early-onset CRC has increased by roughly 2% per year in both men and women and currently makes up about 14% of all CRC cases.
Early-onset pancreatic cancer and esophageal cancer each currently make up about 5% of all cases of these cancers in the US.
Between 2010 and 2019, the number of newly diagnosed cases of early-onset GI cancers rose by about 15%, with individuals of Black, Hispanic, or Indigenous ancestry and women disproportionately affected, Ng and coauthors noted in a related review published in the British Journal of Surgery.
Modifiable and Nonmodifiable Risk Factors
Along with obesity and poor diet, other modifiable risk factors for early-onset GI cancers include sedentary lifestyle, cigarette smoking, and alcohol consumption.
Nonmodifiable risk factors include family history, hereditary cancer syndromes such as Lynch syndrome, and inflammatory bowel disease.
Roughly 15%-30% of patients with early-onset GI cancers carry pathogenic germline variants in genes such as DNA mismatch repair genes and BRCA1/2.
All individuals with early-onset GI cancers should undergo germline and somatic genetic testing to guide treatment, screen for other cancers (eg, endometrial cancer in Lynch syndrome), and assess familial risk, Ng and Jayakrishnan advised.
Treatment Challenges
Treatment for early-onset GI cancers is generally similar to that for later-onset GI cancers, and the prognosis for patients with early-onset GI cancers is “similar to or worse” than that for patients with later-onset disease, highlighting the need for improved methods of prevention and early detection, the authors said.
Ng noted that younger cancer patients often face more challenges after diagnosis than older patients and benefit from multidisciplinary care, including referral for fertility counseling and preservation if appropriate, and psychosocial support.
“It is very difficult and challenging to receive a cancer diagnosis no matter what age you are, but when a person is diagnosed in their 20s, 30s, or 40s, there are unique challenges,” Ng said.
Studies have documented “much higher levels of psychosocial distress, depression and anxiety” in early-onset cancer patients, “and they also often experience more financial toxicity, disruptions in their education as well as their career and there may be fertility concerns,” Ng added.
Diagnostic Delays and Screening
Currently, screening is not recommended for most early-onset GI cancers; the exception is CRC, for which screening is recommended for average-risk adults in the US starting at age 45.
Yet, despite this recommendation, fewer than 1 in 5 (19.7%) US adults aged 45-49 years were screened in 2021, indicating a significant gap in early detection efforts.
High-risk individuals, such as those with Lynch syndrome, a first-degree relative with CRC, or advanced colorectal adenoma, should begin CRC screening earlier, at an age determined by the specific risk factor.
“Studies have shown significant delays in diagnosis among younger patients. It’s important that prompt diagnosis happens so that these patients do not end up being diagnosed with advanced or metastatic stages of cancer, as they often are,” Ng said.
“Screening adherence is absolutely critical,” co-author Jayakrishnan added in a news release.
“We have strong evidence that colorectal cancer screening saves lives by reducing both the number of people who develop colorectal cancer and the number of people who die from it. Each missed screening is a lost opportunity to detect cancer early when it is more treatable, or to prevent cancer altogether by identifying and removing precancerous polyps,” Jayakrishnan said.
This research had no funding. Ng reported receipt of nonfinancial support from Pharmavite, institutional grants from Janssen, and personal fees from Bayer, Seagen, GlaxoSmithKline, Pfizer, CytomX, Jazz Pharmaceuticals, Revolution Medicines, Redesign Health, AbbVie, Etiome, and CRICO. Ng is an associate editor of JAMA but was not involved in any of the decisions regarding review of the manuscript or its acceptance. Jayakrishnan had no disclosures.
A version of this article appeared on Medscape.com.
First New PTSD Drug in Two Decades On the Horizon?
The Psychopharmacologic Drugs Advisory Committee of the FDA is set to meet on July 18 to consider a supplemental new drug application for brexpiprazole (Rexulti, Otsuka Pharmaceutical Co., Ltd.), in combination with sertraline, for the treatment of adults with posttraumatic stress disorder (PTSD).
If approved, it would be the first new treatment for PTSD in more than 20 years.
“It is my hope that the FDA does approve this treatment for two related reasons — the data look positive and compelling, and there’s a tremendous unmet need in PTSD,” Roger McIntyre, MD, professor of psychiatry and pharmacology and head of the Mood Disorders Psychopharmacology Unit, University of Toronto, Toronto, Ontario, Canada, told this news organization.
What’s in the Treatment Toolbox Now?
PTSD is a “common, severe, and nonremitting condition,” McIntyre noted. According to the National Center for PTSD, the condition affects roughly 13 million adults in the US in any given year. This represents about 5% of the adult population.
PTSD can develop following exposure to traumatic events such as combat, assault, disasters, or severe accidents. Core symptoms of PTSD include intrusive memories and flashbacks, avoidance behaviors, negative alterations in mood and cognition, and hyperarousal.
Currently, the selective serotonin reuptake inhibitors (SSRIs) sertraline and paroxetine are the only FDA-approved medications for PTSD, and while these medications can be effective, many patients fail to achieve remission or discontinue treatment because of adverse effects or lack of response.
Other medications used off-label to treat PTSD — including prazosin, mirtazapine, atypical antipsychotics, and mood stabilizers — have shown variable efficacy.
There has not been a new FDA-approved drug for PTSD in more than two decades, underscoring the need for better therapeutic options, particularly for patients who do not fully respond to SSRIs alone.
Why Brexpiprazole Plus Sertraline?
Brexpiprazole is an atypical antipsychotic currently approved as adjunctive treatment of major depressive disorder (MDD) in adults; treatment of schizophrenia in adults and adolescents aged 13 years or older; and treatment of agitation associated with Alzheimer’s dementia.
The combination of brexpiprazole and sertraline could address the limitations of SSRIs alone by working synergistically to treat PTSD.
Sertraline increases serotonin levels in the brain to improve mood and reduce anxiety. Brexpiprazole has a complex mechanism of action involving multiple neurotransmitter systems, including but not limited to serotonin and dopamine.
Together, they may target different aspects of PTSD, potentially leading to a more comprehensive reduction in symptoms.
What Do the Phase 3 Data Show?
In a pivotal, double-blind, randomized controlled, phase 3 trial, brexpiprazole plus sertraline provided significantly greater relief of PTSD symptoms than sertraline plus placebo.
The results were published late last year in JAMA Psychiatry and reported by this news organization at that time.
The trial enrolled 416 adults (mean age, 37 years; 75% women) aged 18-65 years with a Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition (DSM-5) diagnosis of PTSD and symptoms for at least 6 months prior to screening.
At baseline, participants had a mean Clinician Administered PTSD Scale (CAPS-5) for DSM-5 total score of 38.4, indicating moderate to high severity PTSD. The average time from the index traumatic event was 4 years, and three fourths had no prior exposure to PTSD prescription medications.
Participants underwent a 1-week placebo run-in period followed by randomization to daily oral brexpiprazole 2-3 mg plus sertraline 150 mg or daily sertraline 150 mg plus placebo for 11 weeks.
At week 10, brexpiprazole plus sertraline demonstrated statistically significant greater improvement in the CAPS-5 total score (primary outcome) than sertraline plus placebo (mean change, -19.2 points vs -13.6 points; P < .001).
Brexpiprazole plus sertraline also led to statistically significant greater improvement on all key secondary and other efficacy endpoints, both clinician-reported and patient-reported, including measures of anxiety, depression, intrusive symptoms, hyperarousal, and overall functioning.
Combining an atypical antipsychotic with an antidepressant for PTSD “builds on what we’ve been doing in depression,” Elspeth Ritchie, MD, chair of Psychiatry, MedStar Washington Hospital Center, Washington, DC, noted in an interview with this news organization.
“We have found that a combination of a low-dose antipsychotic and an antidepressant is helpful for depression, so it makes sense that it will be helpful for PTSD. However, this has been mostly based on clinical decisions, without a heavy research background. Good science is always helpful to support those clinical decisions,” Ritchie told this news organization.
What About Safety?
In the phase 3 trial, brexpiprazole plus sertraline had a safety profile consistent with that of brexpiprazole in approved indications. The rate of discontinuation due to adverse events was low (3.9% for brexpiprazole plus sertraline vs 10.2% for sertraline plus placebo), indicating that most participants tolerated the brexpiprazole and sertraline combination treatment, the study team said.
In both treatment groups, the only treatment-emergent adverse event (TEAE) with incidence greater than 10% was nausea, a known adverse effect of sertraline treatment.
Weight gain was greater in participants receiving the combination. At the last visit, a weight gain of 7% or more from week 1 was reported in 8% of participants in the brexpiprazole plus sertraline group and 5% of those in the sertraline plus placebo group. Previous analyses in schizophrenia and MDD show that brexpiprazole is associated with moderate weight gain (+3 to 4 kg over 1 year).
The incidence of sedating TEAEs (a concern with some antipsychotics) was generally low, although fatigue (7% vs 4%) and somnolence (5% vs 3%) were more common with brexpiprazole plus sertraline than with sertraline alone.
There were no clinically meaningful between-group differences in changes in laboratory test parameters, vital signs, or ECG findings, or in participant-reported TEAEs related to suicidality.
Potential Concerns
As with any new drug application, several questions and issues are likely to be raised by the advisory committee. They could include whether the clinical benefit is substantial enough to warrant approval and how the observed effect sizes compare to existing approved therapies and evidence-based psychotherapies.
McIntyre told this news organization what’s particularly noteworthy is that the magnitude of the improvement in PTSD symptoms with brexpiprazole plus sertraline is greater than with sertraline alone. “That’s a very important statement. And this high level of efficacy was consistent on the secondary outcome measures, and the overall tolerability and safety seemed very acceptable,” he said.
What’s equally important, said McIntyre, is that most people with PTSD have depression and anxiety, and the brexpiprazole plus sertraline combination was more helpful than sertraline alone on the measures of anxiety and depression. “This is really important, especially in light of the fact that this medication [brexpiprazole] is already approved for adults living with major depressive disorder and inadequate response to antidepressants,” McIntyre said.
McIntyre added he suspects some questions the committee may have could relate to the extent to which it’s the case that brexpiprazole is effective in PTSD regardless of the antidepressant that is prescribed with it.
“There also will be the inevitable questions about the absence of long-term data which I think will need to be addressed given how chronic and relapse prone this condition is,” McIntyre said.
The committee may ask how trauma and PTSD will be screened in primary care and how outcomes related to this therapy will be evaluated in everyday clinical practice, McIntyre said.
Overall, McIntyre said brexpiprazole plus sertraline in PTSD is a “very positive” development for the field.
“PTSD is a terrible condition. It’s so darn common, and we just don’t have enough treatments for it. The data look good from my perspective. My fingers are crossed for the patients with PTSD and their families,” said McIntyre.
Ritchie reported having no relevant disclosures. McIntyre received speaker/consultation fees from Lundbeck, Janssen, Alkermes, Neumora Therapeutics, Boehringer Ingelheim, Sage, Biogen, Mitsubishi Tanabe, Purdue, Pfizer, Otsuka, Takeda, Neurocrine, Sunovion, Bausch Health, Axsome, Novo Nordisk, Kris, Sanofi, Eisai, Intra-Cellular, NewBridge Pharmaceuticals, Viatris, AbbVie, and atai Life Sciences. McIntyre is CEO of Braxia Scientific Corp.
A version of this article first appeared on Medscape.com.
IBS, Chronic Idiopathic Constipation Surged During Pandemic
Rates of irritable bowel syndrome (IBS) and chronic idiopathic constipation surged during the COVID-19 pandemic, with a near doubling of the national rate of IBS over 2 years, a study has found.
The uptick is probably due not only to the direct impact of SARS-CoV-2 infection on the gastrointestinal tract but also to the psychological stress associated with pandemic life, the study team said.
“COVID infection itself can definitely cause gastrointestinal symptoms like diarrhea, nausea, and abdominal pain — and for some people, those symptoms can linger and lead to chronic conditions like IBS,” Christopher V. Almario, MD, MSHPM, lead author and gastroenterologist at Cedars-Sinai Medical Center, Los Angeles, California, told GI & Hepatology News.
“But the stress of living through the pandemic — lockdowns, fear, isolation — also likely played a major role as well in the increased prevalence of digestive disorders. Both the infection itself and the psychological toll of the pandemic can disrupt the gut-brain axis and trigger chronic digestive disorders like IBS,” Almario said.
The study was published in Neurogastroenterology & Motility.
Growing Burden of Gut Disorders
Disorders of gut-brain interaction (DGBIs) are a heterogeneous group of conditions in which gastrointestinal symptoms occur without any detectable structural or biochemical abnormalities in the digestive tract. They include IBS, functional dyspepsia, and chronic idiopathic constipation, among others.
DGBIs are highly prevalent. Research has shown that nearly 40% of people in the US meet Rome IV criteria for at least one DGBI.
Almario and colleagues assessed trends in the prevalence of these conditions during the COVID-19 pandemic. From May 2020 through May 2022, they conducted a series of online surveys of more than 160,000 adults aged 18 years or older using validated Rome IV diagnostic questionnaires.
Results showed that during the pandemic, IBS prevalence rose from 6.1% in May 2020 to 11.0% by May 2022, an increase of 0.188% per month (adjusted P < .001).
Chronic idiopathic constipation showed a smaller but statistically significant increase, from 6.0% to 6.4% (0.056% per month; adjusted P < .001).
Within the IBS subtypes, mixed-type IBS showed the largest relative increase (0.085% per month), followed by IBS with constipation (0.041% per month) and IBS with diarrhea (0.037% per month).
There were no significant changes in the prevalence of other DGBIs, such as functional bloating, functional diarrhea, or functional dyspepsia, during the study period.
Almario told GI & Hepatology News that only about 9% of those surveyed reported a positive COVID test at the time of the surveys, but that figure probably underrepresents actual infections, especially in the early months of the pandemic. “Most of the survey responses came in during the earlier phases of the pandemic, and the percentage reporting a positive test increased over time,” he explained.
Almario also noted that this study did not directly compare digestive disorder rates between infected and uninfected individuals. However, a separate study by the Cedars-Sinai team currently undergoing peer review addresses that question more directly. “That study, along with several other studies, show that having COVID increases the risk of developing conditions like IBS and functional dyspepsia,” Almario said.
Taken together, the findings “underscore the increasing healthcare and economic burden of DGBI in the post-pandemic era, emphasizing the need for targeted efforts to effectively diagnose and manage these complex conditions,” they wrote.
“This will be especially challenging for healthcare systems to address, given the existing shortage of primary care physicians and gastroenterologists — clinicians who primarily manage individuals with DGBI,” they noted.
Support for this study was received from Ironwood Pharmaceuticals and Salix Pharmaceuticals in the form of institutional research grants to Cedars-Sinai. Almario has consulted for Exact Sciences, Greenspace Labs, Owlstone Medical, Salix Pharmaceuticals, and Universal DX.
A version of this article appeared on Medscape.com.
Are Breast Cancer Survivors Vulnerable to Alzheimer’s Disease?
Despite concerns about cognitive decline after cancer treatment, most breast cancer survivors show no increased risk of developing Alzheimer’s disease, and some may have a slightly lower risk than their cancer-free peers, according to a large retrospective study from Korea.
However, any apparent protective effect faded with time, the investigators reported online in JAMA Network Open.
Overall, this is “reassuring news for cancer survivors,” Tim Ahles, PhD, a psychologist with Memorial Sloan Kettering Cancer Center, New York City, who wasn’t involved in the study, told this news organization.
“I get this question from patients a lot,” Ahles said. And based on these findings, “it doesn’t look like a history of breast cancer and breast cancer treatment increases your risk for Alzheimer’s disease.”
Breast cancer survivors often report cancer-related cognitive impairment, such as difficulties with concentration and memory, both during and after cancer treatment. But evidence surrounding patients’ risk for Alzheimer’s disease is mixed. One large study based in Sweden, for instance, reported a 35% increased risk for Alzheimer’s disease among patients diagnosed with breast cancer after the age of 65 years, but not among younger patients. A population-based study from Taiwan, however, found no increase in the risk for dementia overall compared with cancer-free individuals but did note a lower dementia risk in patients who had received tamoxifen.
To help clarify the evidence, investigators assessed Alzheimer’s disease risk in a large cohort of patients and explored the association by treatment type, age, and important risk factors.
Using the Korean National Health Insurance Service database, the researchers matched 70,701 patients who underwent breast cancer surgery between 2010 and 2016 with 180,360 cancer-free control individuals.
The mean age of breast cancer survivors was 53.1 years. Overall, 72% received radiotherapy. Cyclophosphamide (57%) and anthracycline (50%) were the most commonly used chemotherapies, and tamoxifen (47%) and aromatase inhibitors (30%) were the most commonly used endocrine therapies.
The primary outcome of this study was the incidence of newly diagnosed Alzheimer’s disease, which was defined on the basis of at least one prescription for medications to manage dementia associated with Alzheimer’s disease (donepezil, rivastigmine, galantamine, or memantine).
During a median follow-up of about 7 years, 1229 newly diagnosed Alzheimer’s disease cases were detected in breast cancer survivors and 3430 cases in control individuals — incidence rates of 2.45 and 2.63 per 1000 person-years, respectively.
This corresponded to an 8% lower risk for Alzheimer’s disease in breast cancer survivors compared with cancer-free control individuals at 6 months (subdistribution hazard ratio [SHR], 0.92; 95% CI, 0.86-0.98). The association was especially notable in survivors older than 65 years (SHR, 0.92; 95% CI, 0.85-0.99).
Looking at individual treatment modalities, only radiation therapy was associated with significantly lower risk for Alzheimer’s disease among breast cancer survivors (adjusted HR [aHR], 0.77).
Several risk factors were associated with a significantly higher risk for Alzheimer’s disease: current smoker vs never or ex-smokers (aHR, 2.04), diabetes (aHR, 1.58), and chronic kidney disease (aHR, 3.11). Notably, alcohol use, physical activity level, and hypertension were not associated with Alzheimer’s disease risk.
However, any potential protective effect may be short-lived. The reduced risk for Alzheimer’s disease was no longer significant at 1 year (SHR, 0.94; 95% CI, 0.87-1.01), 3 years (SHR, 0.97; 95% CI, 0.90-1.05), or 5 years (SHR, 0.98; 95% CI, 0.89-1.08).
Even so, breast cancer survivors can still feel reassured by the findings.
“Concerns about chemobrain and the long-term adverse effects of breast cancer treatment on cognition are common, but our findings suggest that this treatment does not directly lead to Alzheimer’s disease,” wrote the authors, led by Su-Min Jeong, MD, with Seoul National University College of Medicine, Seoul, South Korea.
Ahles agreed. The general takeaway from this study is that there is “no strong evidence that the cancer treatment is going to increase your risk for developing Alzheimer’s,” Ahles said. When patients ask about the risk for Alzheimer’s disease, “I can say, ‘Here’s yet another new study that supports the idea that there’s no increased risk.’”
He cautioned, however, that the study doesn’t address whether people with a genetic predisposition to Alzheimer’s might develop it sooner due to cancer treatment.
“Does the cancer treatment increase your probability or nudge you along? The study doesn’t answer that question,” Ahles said.
The study had no commercial funding. Jeong and Ahles reported no relevant disclosures.
A version of this article first appeared on Medscape.com.
Celiac Blood Test Eliminates Need for Eating Gluten
Think your patient may have celiac disease? The harsh reality is that current diagnostic tests require patients to consume gluten for an accurate diagnosis, which poses challenges for individuals already avoiding gluten.
A more tolerable approach appears to be on the horizon.
“This is a simple and accurate test that can provide a diagnosis within a very short time frame, without the need for patients to continue eating gluten and feeling sick, or to wait months for a gastroscopy,” Olivia Moscatelli, PhD candidate, Tye-Din Lab, Walter and Eliza Hall Institute and University of Melbourne, Parkville, Australia, told GI & Hepatology News.
The study was published in Gastroenterology.
Most Cases Go Undiagnosed
Celiac disease is an autoimmune disorder triggered by gluten found in wheat, rye, and barley. The only available treatment is a strict, life-long gluten-free diet.
The global prevalence of celiac disease is estimated at around 1%-2%, with 50%-80% of cases either undiagnosed or diagnosed late. That’s because the current reliable diagnosis of celiac disease requires the intake of gluten, which may deter people from seeking a diagnosis.
In earlier work, the researchers, working with Robert Anderson, MBChB, BMedSc, PhD, AGAF, now with Novoviah Pharmaceuticals, made the unexpected discovery that interleukin-2 (IL-2) spiked in the blood of people with celiac disease shortly after they ate gluten.
But would this signal be present when no gluten had been consumed?
The team developed and tested a simple whole blood assay measuring IL-2 release (WBAIL-2) for detecting gluten-specific T cells to aid in diagnosing celiac disease.
They collected blood samples from 181 volunteers — 75 with treated celiac disease on a gluten-free diet, 13 with active untreated celiac disease, 32 with nonceliac gluten sensitivity, and 61 healthy controls. The blood samples were mixed with gluten in a test tube for a day to see if the IL-2 signal appeared.
The WBAIL-2 assay demonstrated high accuracy for celiac disease, even in patients following a strict gluten-free diet.
For patients with HLA-DQ2.5+ genetics, sensitivity was 90% and specificity was 95%, with lower sensitivity (56%) for patients with HLA-DQ8+ celiac disease.
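As a quick refresher, sensitivity is the proportion of true celiac cases the assay flags as positive, and specificity is the proportion of non-celiac samples it correctly calls negative. The counts in the short sketch below are hypothetical, chosen only to make the arithmetic concrete; they are not the study’s actual confusion matrix.

```python
# Hypothetical confusion-matrix counts for illustration only; these are not the
# study's actual per-group numbers.
true_pos, false_neg = 45, 5    # assumed 50 HLA-DQ2.5+ patients with celiac disease
true_neg, false_pos = 57, 3    # assumed 60 samples without celiac disease

sensitivity = true_pos / (true_pos + false_neg)   # share of true cases detected -> 0.90
specificity = true_neg / (true_neg + false_pos)   # share of non-cases correctly cleared -> 0.95

print(f"sensitivity = {sensitivity:.0%}, specificity = {specificity:.0%}")
```

By the same arithmetic, the 56% sensitivity reported for HLA-DQ8+ patients means the assay alone would miss nearly half of true cases in that subgroup.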
The WBAIL-2 assay correlated strongly with the frequency of tetramer-positive gluten-specific CD4+ T cells used to diagnose celiac disease and monitor treatment effectiveness, and with serum IL-2 levels after gluten challenge.
The strength of the IL-2 signal correlated with the severity of a patient’s symptoms, “allowing us to predict how severely a person with celiac disease might react to gluten, without them actually having to eat it,” Moscatelli said in a news release.
“Current diagnostic practice involves a blood-based serology test followed by a confirmatory gastroscopy if positive. Both tests require the patient to eat gluten daily for 6-12 weeks prior for accurate results. We envision the new blood test (IL-2 whole blood assay) will replace the invasive gastroscopy as the confirmatory test following positive serology,” Moscatelli told GI & Hepatology News.
“In people already following a gluten-free diet, we propose they would have this new blood test done on two separate occasions and two positive results would be required for a celiac diagnosis. This would allow a large number of people who previously have been unable to go through the current diagnostic process to receive a diagnosis,” Moscatelli said.
Practice Changing Potential
A blood-based test that can accurately detect celiac disease without the need for a gluten challenge would be “welcome and practice changing,” said Christopher Cao, MD, director, Celiac Disease Program, Division of Gastroenterology, Mount Sinai Health System, New York City.
“A typical ‘gluten challenge’ involves eating the equivalent of 1-2 slices of bread daily for the course of 6 weeks, and this may be incredibly difficult for patients who have already been on a gluten-free diet prior to an official celiac disease diagnosis. Inability to perform a gluten challenge limits the ability to make an accurate celiac disease diagnosis,” Cao told GI & Hepatology News.
“This study shows that gluten-stimulated interleukin-2 release assays may correlate with the presence of pathogenic gluten-specific CD4+ T cell response in celiac disease,” Cao noted.
He cautioned that “further large cohort, multicenter prospective studies are needed to assess generalizability and may be helpful in evaluating the accuracy of WBAIL-2 in non-HLA DQ2.5 genotypes.”
Other considerations prior to implementation may include reproducibility across different laboratories and overall cost effectiveness, Cao said. “Ultimately in clinic, the role of WBAIL-2 will need to be better defined within the algorithm of celiac disease testing,” he added.
The Path Ahead
The researchers plan to test the performance of the IL-2 whole blood assay in a pediatric cohort, as well as in other countries to demonstrate the reproducibility of the test. In these studies, the test will likely be performed alongside the current diagnostic tests (serology and gastroscopy), Moscatelli told GI & Hepatology News.
“There are some validation studies starting in other countries already as many celiac clinicians globally are interested in bringing this test to their clinical practice. I believe the plan is to have this as an approved diagnostic test for celiac disease worldwide,” she said.
Novoviah Pharmaceuticals is managing the commercialization of the test, and the plan is to get it into clinical practice in the next 2 years, Moscatelli said.
The research was supported by Coeliac Australia, Novoviah Pharmaceuticals (which provided the proprietary test for this study), the Beck Family Foundation, the Butterfield Family, and the Veith Foundation. A complete list of author disclosures is available with the original article. Cao had no relevant disclosures.
A version of this article appeared on Medscape.com.
FROM GASTROENTEROLOGY
Gut Microbiome Changes in Chronic Pain — Test and Treat?
A new study adds to what has been emerging in the literature — namely that distinct gut microbiome signatures accompany a range of chronic pain conditions — suggesting that microbiome-based diagnostics and therapeutics may one day be routine for a broad range of pain conditions.
“There is now a whole list of pain conditions that appear to have these signatures, including postoperative pain, arthritis, neuropathy, and migraine, to name a few,” Robert Bonakdar, MD, director of pain management, Scripps Center for Integrative Medicine, San Diego, told GI & Hepatology News.
Fibromyalgia and complex regional pain syndrome (CRPS) are also on the list.
A team led by Amir Minerbi, MD, PhD, director of the Institute for Pain Medicine, Haifa, Israel, and colleagues published one of the first articles on gut changes in fibromyalgia. They noted that the gut microbiome could be used to determine which individuals had the condition and which did not — with about 90% accuracy.
The team went on to show that transplanting gut microbiota from patients with fibromyalgia into germ-free mice was sufficient to induce pain-like behaviors in the animals — “effects that were reversed when healthy human microbiota were transplanted instead,” Minerbi told GI & Hepatology News.
Further, in a pilot clinical study, the researchers showed that transplanting microbiota from healthy donors led to a reduction in pain and other symptoms in women with treatment-resistant fibromyalgia.
Most recently, they found significant differences in the composition of the gut microbiome in a cohort of patients with CRPS from Israel, compared to matched pain-free control individuals.
Notably, two species (Dialister succinatiphilus and Phascolarctobacterium faecium) were enriched in patients with CRPS, while three species (Ligilactobacillus salivarius, Bifidobacterium dentium, and Bifidobacterium adolescentis) were increased in control samples, according to their report published last month in Anesthesiology.
“Importantly,” these findings were replicated in an independent cohort of patients with CRPS from Canada, “suggesting that the observed microbiome signature is robust and consistent across different environments,” Minerbi told GI & Hepatology News.
Causal Role?
“These findings collectively suggest a causal role for the gut microbiome in at least some chronic pain conditions,” Minerbi said.
However, the co-authors of a linked editorial cautioned that it’s “unclear if D succinatiphilus or P faecium are functionally relevant to CRPS pathophysiology or if the bacteria increased in healthy control samples protect against CRPS development.”
Minerbi and colleagues also observed that fecal concentrations of all measured short-chain fatty acids (SCFA) were, on average, lower in patients with CRPS than in pain-free control individuals, with butyric, hexanoic, and valeric acid showing significant depletion.
Additionally, plasma concentrations of acetic acid were significantly lower in patients with CRPS than in control individuals, while propionate, butyrate, isobutyrate, and 2-methyl-butyric acid showed a trend toward lower concentrations.
The quantification of SCFA in patient stool and serum is a “notable advance” in this study, Zulmary Manjarres, PhD; Ashley Plumb, PhD; and Katelyn Sadler, PhD, with the Center for Advanced Pain Studies at The University of Texas at Dallas, wrote in their editorial.
SCFA are produced by bacteria as a byproduct of dietary fiber fermentation, and appropriate levels of these compounds are important for maintaining low levels of inflammation in the colon and overall gut health, they explained.
This raises the question of whether administering probiotic bacteria — many of which are believed to exert health benefits through SCFA production — can be used to treat CRPS-associated pain. It’s something that needs to be studied, the editorialists wrote.
Yet, in their view, the “most notable achievement” of Minerbi and colleagues is the development of a machine learning model that accurately, specifically, and sensitively categorized individuals as patients with CRPS or control individuals based on their fecal microbiome signature.
The model, trained on exact sequence variant data from the Israeli patients, achieved 89.5% accuracy, 90.0% sensitivity, and 88.9% specificity in distinguishing patients with CRPS from control individuals in the Canadian cohort.
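The report summarized here does not lay out the full modeling pipeline, so the sketch below is only one plausible way to run such a cross-cohort evaluation: train a classifier on per-sample abundances of exact sequence variants from one cohort, then score it on an independent cohort and report accuracy, sensitivity, and specificity. The random forest, the simulated abundance matrices, and the cohort sizes are all assumptions made for illustration, not the authors’ method or data.

```python
# Illustrative cross-cohort evaluation of a microbiome classifier. The data are
# simulated and the model choice is an assumption, not the study's pipeline.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, confusion_matrix

rng = np.random.default_rng(0)
X_train = rng.random((80, 200))    # training cohort: 80 samples x 200 sequence-variant abundances (simulated)
y_train = rng.integers(0, 2, 80)   # 1 = CRPS, 0 = pain-free control (simulated labels)
X_test = rng.random((40, 200))     # independent validation cohort (simulated)
y_test = rng.integers(0, 2, 40)

model = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_train, y_train)
y_pred = model.predict(X_test)

tn, fp, fn, tp = confusion_matrix(y_test, y_pred).ravel()
print(f"accuracy:    {accuracy_score(y_test, y_pred):.3f}")
print(f"sensitivity: {tp / (tp + fn):.3f}")
print(f"specificity: {tn / (tn + fp):.3f}")
```

With real data, the labels would come from clinical diagnosis and the features from sequencing, but the evaluation logic (fit on one cohort, test on another) is what gives the 89.5% accuracy figure its weight as external validation.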
Interestingly, in three patients with CRPS who underwent limb amputation and recovered from their pain, their gut microbiome signature remained unchanged, suggesting that microbiome alterations might precede or persist beyond symptomatic phases.
Test and Treat: Are We There Yet?
The gut microbiome link to chronic pain syndromes is a hot area of research, but for now gut microbial testing followed by treatment aimed at “fixing” the microbiome remains largely experimental.
At this point, comprehensive gut-microbiome sequencing is not a routine, guideline-supported part of care for fibromyalgia or any chronic pain condition.
“Unfortunately, even for doctors interested in this area, we are not quite at the state of being able to diagnose and treat pain syndrome based on microbiome data,” Bonakdar told GI & Hepatology News.
He said there are many reasons for this, including that this type of microbiome analysis is not commonly available at routine labs. If patients do obtain testing, the results are quite complex and may not translate into a diagnosis or a simple microbiome intervention.
“I think the closest option we have now is considering supplementing with commonly beneficial probiotic in pain conditions,” Bonakdar said.
One example is a preliminary fibromyalgia trial, which found that supplementing with Lactobacillus, Bifidobacterium, and Saccharomyces boulardii appeared to have benefit.
“Unfortunately, this is hit or miss as other trials such as one in low back pain did not find benefit,” Bonakdar said.
Addressing gut microbiome changes will become “more actionable when microbiome analysis is more commonplace as well as is the ability to tailor treatment to the abnormalities seen on testing in a real-world manner,” Bonakdar said.
“Until then, there is no harm in promoting an anti-inflammatory diet for our patients with pain which we know can improve components of the microbiome while also supporting pain management,” he concluded.
Minerbi, Bonakdar, and the editorial writers had no relevant disclosures.
A version of this article appeared on Medscape.com.