Second-trimester blood test predicts preterm birth


A new blood test performed in the second trimester could help identify pregnancies at risk of early and very early spontaneous preterm birth (sPTB), based on a prospective cohort study.

The cell-free RNA (cfRNA) profiling tool could guide patient and provider decision-making, while the underlying research illuminates biological pathways that may facilitate novel interventions, reported lead author Joan Camunas-Soler, PhD, of Mirvie, South San Francisco, and colleagues.

“Given the complex etiology of this heterogeneous syndrome, it would be advantageous to develop predictive tests that provide insight on the specific pathophysiology leading to preterm birth for each particular pregnancy,” Dr. Camunas-Soler and colleagues wrote in the American Journal of Obstetrics and Gynecology. “Such an approach could inform the development of preventive treatments and targeted therapeutics that are currently lacking/difficult to implement due to the heterogeneous etiology of sPTB.”

Currently, the best predictor of sPTB is previous sPTB, according to the investigators. Although a combination approach that incorporates cervical length and fetal fibronectin in cervicovaginal fluid is “of use,” they noted, “this is not standard of care in the U.S.A. nor recommended by the American College of Obstetricians and Gynecologists or the Society for Maternal-Fetal Medicine.” Existing molecular tests lack clinical data and may be inaccurate across diverse patient populations, they added.

The present study aimed to address these shortcomings by creating a second-trimester blood test for predicting sPTB. To identify relevant biomarkers, the investigators compared RNA profiles that were differentially expressed in three types of cases: term birth, early sPTB, and very early sPTB.

Among 242 women who contributed second-trimester blood samples for analysis, 194 went on to have a term birth. Of the remaining 48 women who gave birth spontaneously before 35 weeks’ gestation, 32 delivered between 25 and 35 weeks (early sPTB), while 16 delivered before 25 weeks’ gestation (very early sPTB). Slightly more than half of the patients were White, about one-third were Black, approximately 10% were Asian, and the remainder were of unknown race/ethnicity. Cases of preeclampsia were excluded.

The gene discovery and modeling process revealed 25 distinct genes that were significantly associated with early sPTB, offering a risk model with a sensitivity of 76% and a specificity of 72% (area under the curve, 0.80; 95% confidence interval, 0.72-0.87). Very early sPTB was associated with a set of 39 genes, giving a model with a sensitivity of 64% and a specificity of 80% (area under the curve, 0.76; 95% CI, 0.63-0.87).
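For a feel for what these operating characteristics mean in practice, the sketch below applies the early sPTB model’s reported sensitivity and specificity to this cohort’s own case mix (32 early sPTB cases, 194 term births). This is back-of-the-envelope arithmetic only, not a reanalysis; because the cohort is enriched for preterm cases, the resulting positive predictive value would not carry over to a general screening population.

```python
# Illustrative arithmetic only: reported operating characteristics of the
# early sPTB model applied to this cohort's own case mix.
sensitivity = 0.76   # reported sensitivity of the early sPTB model
specificity = 0.72   # reported specificity of the early sPTB model
n_cases, n_controls = 32, 194  # early sPTB cases and term births in the cohort

true_positives = sensitivity * n_cases            # cases correctly flagged
false_positives = (1 - specificity) * n_controls  # term births flagged in error
ppv = true_positives / (true_positives + false_positives)

print(round(true_positives))   # ~24 of 32 early sPTB cases detected
print(round(false_positives))  # ~54 of 194 term births incorrectly flagged
print(round(ppv, 2))           # positive predictive value ~0.31 in this cohort
```

In other words, at the cohort’s (artificially high) preterm prevalence, roughly two of every three positive results would be false positives, which is the trade-off behind the false-positive question raised later in the article.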

Characterization of the two RNA profiles offered a glimpse into the underlying biological processes driving preterm birth. The genes predicting early sPTB are largely responsible for extracellular matrix degradation and remodeling, which could, “in terms of mechanism, reflect ongoing processes associated with cervical shortening, a feature often detected some weeks prior to sPTB,” the investigators wrote. In contrast, genes associated with very early sPTB are linked with insulinlike growth factor transport, which drives fetal growth and placentation. These findings could lead to development of pathway-specific interventions, Dr. Camunas-Soler and colleagues suggested.

According to coauthor Michal A. Elovitz, MD, the Hilarie L. Morgan and Mitchell L. Morgan President’s Distinguished Professor in Women’s Health at the University of Pennsylvania, Philadelphia, and chief medical advisor at Mirvie, the proprietary RNA platform moves beyond “unreliable and at times biased clinical factors such as race, BMI, and maternal age” to offer a “precision-based approach to pregnancy health.”

Excluding traditional risk factors also “promises more equitable care than the use of broad sociodemographic factors that often result in bias,” she added, noting that this may help address the higher rate of pregnancy complications among Black patients.

When asked about the potential for false-positive results, considering reported specificity rates of 72%-80%, Dr. Elovitz suggested that such concerns among pregnant women are an “unfortunate misconception.”

“It is not reflective of what women want regarding knowledge about the health of their pregnancy,” she said in a written comment. “Rather than be left in the dark, women want to be prepared for what is to come in their pregnancy journey.”

In support of this statement, Dr. Elovitz cited a recent study involving women with preeclampsia and other hypertensive disorders in pregnancy. A questionnaire showed that women appreciated pregnancy risk models when making decisions, and reported that they would have greater peace of mind if such tests were available.

Laura Jelliffe-Pawlowski, PhD, of the California Preterm Birth Initiative at the University of California, San Francisco, supported Dr. Elovitz’s viewpoint.

“If you talk to women who have delivered preterm most (but not all) say that they would have wanted to know their risk so they could have been better prepared,” she said in a written comment. “I think we need to shift the narrative to empowerment away from fear.”

Dr. Jelliffe-Pawlowski, who holds a patent for a separate test predicting preterm birth, said that the Mirvie RNA platform is “promising,” although she expressed concern that excluding patients with preeclampsia – representing approximately 4% of pregnancies in the United States – may have clouded accuracy results.

“What is unclear is how the test would perform more generally when a sample of all pregnancies was included,” she said. “Without that information, it is hard to compare their findings with other predictive models without such exclusions.”

Regardless of the model used, Dr. Jelliffe-Pawlowski said that more research is needed to determine best clinical responses when risk of sPTB is increased.

“Ultimately we want to connect action with results,” she said. “Okay, so [a woman] is at high risk for delivering preterm – now what? There is a lot of untapped potential once you start to focus more with women and birthing people you know have a high likelihood of preterm birth.”

The study was supported by Mirvie, Tommy’s Charity, and the National Institute for Health Research Biomedical Research Centre. The investigators disclosed financial relationships with Mirvie, including equity interest and/or intellectual property rights. Cohort contributors were remunerated for sample collection and/or shipping. Dr. Jelliffe-Pawlowski holds a patent for a different preterm birth prediction blood test.

*This story was updated on 4/26/2022. 

FROM AMERICAN JOURNAL OF OBSTETRICS AND GYNECOLOGY


Study: Fasting plus calorie counting offered no weight-loss benefit over calorie counting alone


Not so fast! Daily fasting with calorie restriction may not lead to shedding more pounds than just cutting back on calories, according to the authors of a new study.

Over the course of a year, study participants who ate only from 8:00 a.m. to 4:00 p.m. did not lose significantly more weight than individuals who ate whenever they wanted, nor did they achieve significantly greater improvements in other obesity-related health measures like body mass index (BMI) or metabolic risk, reported lead author Deying Liu, MD, of Nanfang Hospital, Southern Medical University, Guangzhou, China, and colleagues.

“[Daily fasting] has gained popularity because it is a weight-loss strategy that is simple to follow, which may enhance adherence,” Dr. Liu and colleagues wrote in the New England Journal of Medicine. However, “the long-term efficacy and safety of time-restricted eating as a weight-loss strategy are still uncertain, and the long-term effects on weight loss of time-restricted eating as compared with daily calorie restriction alone have not been fully explored.”

To learn more, Dr. Liu and colleagues recruited 139 adult patients with BMIs between 28 and 45. Individuals with serious medical conditions, including malignant tumors, diabetes, and chronic kidney disease, were excluded. Other exclusion criteria included smoking, ongoing participation in a weight-loss program, GI surgery within the prior year, use of medications that impact energy balance and weight, and planned or current pregnancy.

All participants were advised to eat calorie-restricted diets, with ranges of 1,500-1,800 kcal per day for men and 1,200-1,500 kcal per day for women. To determine the added impact of fasting, participants were randomized in a 1:1 ratio into time-restricted (fasting) or non–time-restricted (nonfasting) groups, in which fasting participants ate only during an 8-hour window from 8:00 a.m. to 4:00 p.m., whereas nonfasting participants ate whenever they wanted.

At 6 months and 12 months, participants were re-evaluated for changes in weight, body fat, BMI, blood pressure, lean body mass, and metabolic risk factors, including glucose level, triglycerides, and others.

Caloric intake restriction seems to explain most of the beneficial effects

At 1-year follow-up, 118 participants (84.9%) remained in the study. Although members of the fasting group lost slightly more weight on average than those in the nonfasting group (mean, 8.0 kg vs. 6.3 kg), the difference between groups was not statistically significant (95% confidence interval, −4.0 to 0.4 kg; P = .11).
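The significance call follows directly from the reported confidence interval: a 95% CI for a between-group difference that contains zero is, by convention, not statistically significant at that level. A minimal sketch, using only the interval bounds reported above:

```python
# Reported 95% CI for the between-group weight difference, in kg.
ci_low, ci_high = -4.0, 0.4

# By convention, the difference is statistically significant at this level
# only if the interval excludes zero; here it straddles zero, so it is not.
is_significant = not (ci_low <= 0 <= ci_high)
print(is_significant)  # False
```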

Most of the other obesity-related health measures also trended toward favoring the fasting group, but again, none of these improvements was statistically significant. Waist circumference at 1 year, for example, decreased by a mean of 9.4 cm in the fasting group versus 8.8 cm in the nonfasting group, a net difference of 1.8 cm (95% CI, –4.0 to 0.5).

“We found that the two weight-loss regimens that we evaluated had similar success in patients with obesity, regardless of whether they reduced their calorie consumption through time-restricted eating or through calorie restriction alone,” Dr. Liu and colleagues concluded.

Principal investigator Huijie Zhang, MD, PhD, professor, chief physician, and deputy director of the department of endocrinology and metabolism at Nanfang Hospital, noted that their findings are “consistent with the findings in previous studies.”

“Our data suggest that caloric intake restriction explained most of the beneficial effects of a time-restricted eating regimen,” Dr. Zhang said.

Still, Dr. Zhang called time-restricted eating “a viable and sustainable approach for a person who wants to lose weight.”

More work is needed, Dr. Zhang said, to uncover the impact of fasting in “diverse groups,” including patients with chronic diseases like diabetes and cardiovascular disease. Investigators should also conduct studies to compare outcomes between men and women, and evaluate the effects of other fasting durations.
Can the trial be applied to a wider population?

According to Blandine Laferrère, MD, PhD, and Satchidananda Panda, PhD, of Columbia University Irving Medical Center, New York, and the Salk Institute for Biological Studies, La Jolla, Calif., respectively, “the results of the trial suggest that calorie restriction combined with time restriction, when delivered with intensive coaching and monitoring, is an approach that is as safe, sustainable, and effective for weight loss as calorie restriction alone.”

Yet Dr. Laferrère and Dr. Panda also expressed skepticism about broader implementation of a similar regimen.

“The applicability of this trial to wider populations is debatable,” they wrote in an accompanying editorial. “The short time period for eating at baseline may be specific to the population studied, since investigators outside China have reported longer time windows. The rigorous coaching and monitoring by trial staff also leaves open the question of whether time-restricted eating is easier to adhere to than intentional calorie restriction. Such cost-benefit analyses are important for the assessment of the scalability of a lifestyle intervention.”
 

Duration is trial’s greatest strength

Kristina Varady, PhD, professor of nutrition in the department of kinesiology and nutrition at the University of Illinois at Chicago, said the “key strength” of the trial was its 12-month duration, making it the longest time-restricted eating trial to date; however, she was critical of the design.


“Quite frankly, I’m surprised this study got into such a high-caliber medical journal,” Dr. Varady said in a written comment. “It doesn’t even have a control group! It goes to show how popular these diets are and how much people want to know about them.”

She also noted that “the study was flawed in that it didn’t really look at the effects of true time-restricted eating.” According to Dr. Varady, combining calorie restriction with time-restricted eating “kind of defeats the purpose” of a time-restricted diet.

“The main benefit of time-restricted eating is that you don’t need to count calories in order to lose weight,” Dr. Varady said, citing two of her own studies from 2018 and 2020. “Just by limiting the eating window to 8 hours per day, people naturally cut out 300-500 calories per day. That’s why people like [time-restricted eating] so much.”

Dr. Varady was also “very surprised” at the adherence data. At 1 year, approximately 85% of the patients were still following the protocol, a notably higher rate than most dietary intervention studies, which typically report adherence rates of 50-60%, she said. The high adherence rate was particularly unexpected because of the 8:00 a.m.–4:00 p.m. eating window, Dr. Varady added, since that meant skipping “the family/social meal every evening over 1 whole year!”

The study was funded by the National Key Research and Development Project and others. The study investigators reported no conflicts of interest. Dr. Varady disclosed author fees from the Hachette Book group for her book “The Every Other Day Diet.”


“The main benefit of time-restricted eating is that you don’t need to count calories in order to lose weight,” Dr. Varady said, citing two of her own studies from 2018 and 2020. “Just by limiting the eating window to 8 hours per day, people naturally cut out 300-500 calories per day. That’s why people like [time-restricted eating] so much.”

Dr. Varady was also “very surprised” at the adherence data. At 1 year, approximately 85% of the patients were still following the protocol, a notably higher rate than most dietary intervention studies, which typically report adherence rates of 50-60%, she said. The high adherence rate was particularly unexpected because of the 8:00 a.m.–4:00 p.m. eating window, Dr. Varady added, since that meant skipping “the family/social meal every evening over 1 whole year!”

The study was funded by the National Key Research and Development Project and others. The study investigators reported no conflicts of interest. Dr. Varady disclosed author fees from the Hachette Book group for her book “The Every Other Day Diet.”

 

Not so fast! Daily fasting with calorie restriction may not lead to shedding more pounds than just cutting back on calories, according to the authors of a new study.

Over the course of a year, study participants who ate only from 8:00 a.m. to 4:00 p.m. did not lose significantly more weight than individuals who ate whenever they wanted, nor did they achieve significantly greater improvements in other obesity-related health measures like body mass index (BMI) or metabolic risk, reported lead author Deying Liu, MD, of Nanfang Hospital, Southern Medical University, Guangzhou, China, and colleagues.

“[Daily fasting] has gained popularity because it is a weight-loss strategy that is simple to follow, which may enhance adherence,” Dr. Liu and colleagues wrote in the New England Journal of Medicine. However, “the long-term efficacy and safety of time-restricted eating as a weight-loss strategy are still uncertain, and the long-term effects on weight loss of time-restricted eating as compared with daily calorie restriction alone have not been fully explored.”

To learn more, Dr. Liu and colleagues recruited 139 adult patients with BMIs between 28 and 45. Individuals with serious medical conditions, such as malignant tumors, diabetes, and chronic kidney disease, were excluded. Other exclusion criteria included smoking, ongoing participation in a weight-loss program, GI surgery within the prior year, use of medications that impact energy balance and weight, and planned or current pregnancy.

All participants were advised to eat calorie-restricted diets, with ranges of 1,500-1,800 kcal per day for men and 1,200-1,500 kcal per day for women. To determine the added impact of fasting, participants were randomized in a 1:1 ratio into time-restricted (fasting) or non–time-restricted (nonfasting) groups, in which fasting participants ate only during an 8-hour window from 8:00 a.m. to 4:00 p.m., whereas nonfasting participants ate whenever they wanted.

At 6 months and 12 months, participants were re-evaluated for changes in weight, body fat, BMI, blood pressure, lean body mass, and metabolic risk factors, including glucose level, triglycerides, blood pressure, and others.
 

Caloric intake restriction seems to explain most of beneficial effects

At one-year follow-up, 118 participants (84.9%) remained in the study. Although members of the fasting group lost slightly more weight on average than those in the non-fasting group (mean, 8.0 kg vs. 6.3 kg), the difference between groups was not statistically significant (95% confidence interval, −4.0 to 0.4; P = .11).

Most of the other obesity-related health measures also trended toward favoring the fasting group, but again, none of these improvements was statistically significant. Waist circumference at 1 year, for example, decreased by a mean of 9.4 cm in the fasting group versus 8.8 cm in the nonfasting group, a net difference of 1.8 cm (95% CI, –4.0 to 0.5).

“We found that the two weight-loss regimens that we evaluated had similar success in patients with obesity, regardless of whether they reduced their calorie consumption through time-restricted eating or through calorie restriction alone,” Dr. Liu and colleagues concluded.

Principal investigator Huijie Zhang, MD, PhD, professor, chief physician, and deputy director of the department of endocrinology and metabolism at Nanfang Hospital, noted that their findings are “consistent with the findings in previous studies.”

“Our data suggest that caloric intake restriction explained most of the beneficial effects of a time-restricted eating regimen,” Dr. Zhang said.

Still, Dr. Zhang called time-restricted eating “a viable and sustainable approach for a person who wants to lose weight.”

More work is needed, Dr. Zhang said, to uncover the impact of fasting in “diverse groups,” including patients with chronic diseases such as diabetes and cardiovascular disease. Investigators should also conduct studies to compare outcomes between men and women, and evaluate the effects of other fasting durations.

Can trial be applied to a wider population?

According to Blandine Laferrère, MD, PhD, and Satchidananda Panda, PhD, of Columbia University Irving Medical Center, New York, and the Salk Institute for Biological Studies, La Jolla, Calif., respectively, “the results of the trial suggest that calorie restriction combined with time restriction, when delivered with intensive coaching and monitoring, is an approach that is as safe, sustainable, and effective for weight loss as calorie restriction alone.”

Yet Dr. Laferrère and Dr. Panda also expressed skepticism about broader implementation of a similar regime.

“The applicability of this trial to wider populations is debatable,” they wrote in an accompanying editorial. “The short time period for eating at baseline may be specific to the population studied, since investigators outside China have reported longer time windows. The rigorous coaching and monitoring by trial staff also leaves open the question of whether time-restricted eating is easier to adhere to than intentional calorie restriction. Such cost-benefit analyses are important for the assessment of the scalability of a lifestyle intervention.”
 

Duration is trial’s greatest strength

Kristina Varady, PhD, professor of nutrition in the department of kinesiology and nutrition at the University of Illinois at Chicago, said the “key strength” of the trial was its 12-month duration, making it the longest time-restricted eating trial to date; however, she was critical of the design.


“Quite frankly, I’m surprised this study got into such a high-caliber medical journal,” Dr. Varady said in a written comment. “It doesn’t even have a control group! It goes to show how popular these diets are and how much people want to know about them.”

She also noted that “the study was flawed in that it didn’t really look at the effects of true time-restricted eating.” According to Dr. Varady, combining calorie restriction with time-restricted eating “kind of defeats the purpose” of a time-restricted diet.

“The main benefit of time-restricted eating is that you don’t need to count calories in order to lose weight,” Dr. Varady said, citing two of her own studies from 2018 and 2020. “Just by limiting the eating window to 8 hours per day, people naturally cut out 300-500 calories per day. That’s why people like [time-restricted eating] so much.”

Dr. Varady was also “very surprised” at the adherence data. At 1 year, approximately 85% of the patients were still following the protocol, a notably higher rate than most dietary intervention studies, which typically report adherence rates of 50-60%, she said. The high adherence rate was particularly unexpected because of the 8:00 a.m.–4:00 p.m. eating window, Dr. Varady added, since that meant skipping “the family/social meal every evening over 1 whole year!”

The study was funded by the National Key Research and Development Project and others. The study investigators reported no conflicts of interest. Dr. Varady disclosed author fees from the Hachette Book Group for her book “The Every Other Day Diet.”

FROM THE NEW ENGLAND JOURNAL OF MEDICINE


Childhood abuse may increase risk of MS in women

Article Type
Changed

Emotional or sexual abuse in childhood may increase risk of multiple sclerosis (MS) in women, and risk may increase further with exposure to multiple kinds of abuse, according to the first prospective cohort study of its kind.

More research is needed to uncover underlying mechanisms of action, according to lead author Karine Eid, MD, a PhD candidate at Haukeland University Hospital, Bergen, Norway, and colleagues.

“Trauma and stressful life events have been associated with an increased risk of autoimmune disorders,” the investigators wrote in the Journal of Neurology, Neurosurgery & Psychiatry. “Whether adverse events in childhood can have an impact on MS susceptibility is not known.”

The present study recruited participants from the Norwegian Mother, Father and Child cohort, a population consisting of Norwegian women who were pregnant from 1999 to 2008. Of the 77,997 participating women, 14,477 reported emotional, sexual, and/or physical abuse in childhood, while the remaining 63,520 women reported no abuse. After a mean follow-up of 13 years, 300 women were diagnosed with MS, among whom 24% reported a history of childhood abuse, compared with 19% among women who did not develop MS.

To look for associations between childhood abuse and risk of MS, the investigators used a Cox model adjusted for confounders and mediators, including smoking, obesity, adult socioeconomic factors, and childhood social status. The model revealed that emotional abuse increased the risk of MS by 40% (hazard ratio [HR] 1.40; 95% confidence interval [CI], 1.03-1.90), and sexual abuse increased the risk of MS by 65% (HR 1.65; 95% CI, 1.13-2.39).

Although physical abuse alone did not significantly increase risk of MS (HR 1.31; 95% CI, 0.83-2.06), it did contribute to a dose-response relationship when women were exposed to more than one type of childhood abuse. Women exposed to two out of three abuse categories had a 66% increased risk of MS (HR 1.66; 95% CI, 1.04-2.67), whereas women exposed to all three types of abuse had the highest risk of MS, at 93% (HR 1.93; 95% CI, 1.02-3.67).

Dr. Eid and colleagues noted that their findings are supported by previous retrospective research, and discussed possible mechanisms of action.

“The increased risk of MS after exposure to childhood sexual and emotional abuse may have a biological explanation,” they wrote. “Childhood abuse can cause dysregulation of the hypothalamic-pituitary-adrenal axis, lead to oxidative stress, and induce a proinflammatory state decades into adulthood. Psychological stress has been shown to disrupt the blood-brain barrier and cause epigenetic changes that may increase the risk of neurodegenerative disorders, including MS.

“The underlying mechanisms behind this association should be investigated further,” they concluded.
 

Study findings should guide interventions

Commenting on the research, Ruth Ann Marrie, MD, PhD, professor of medicine and community health sciences and director of the multiple sclerosis clinic at Max Rady College of Medicine, Rady Faculty of Health Sciences, University of Manitoba, Winnipeg, said that the present study “has several strengths compared to prior studies – including that it is prospective and the sample size.”

Dr. Marrie, who was not involved in the study, advised clinicians in the field to take note of the findings, as patients with a history of abuse may need unique interventions.

“Providers need to recognize the higher prevalence of childhood maltreatment in people with MS,” Dr. Marrie said in an interview. “These findings dovetail with others that suggest that adverse childhood experiences are associated with increased mental health concerns and pain catastrophizing in people with MS. Affected individuals may benefit from additional psychological supports and trauma-informed care.”

Tiffany Joy Braley, MD, associate professor of neurology, and Carri Polick, RN and PhD candidate at the school of nursing, University of Michigan, Ann Arbor, who published a case report last year highlighting the importance of evaluating stress exposure in MS, suggested that the findings should guide interventions at both a system and patient level.

“Although a cause-and-effect relationship cannot be established by the current study, these and related findings should be considered in the context of system level and policy interventions that address links between environment and health care disparities,” they said in a joint, written comment. “Given recent impetus to provide trauma-informed health care, these data could be particularly informative in neurological conditions which are associated with high mental health comorbidity. Traumatic stress screening practices could lead to referrals for appropriate support services and more personalized health care.”

While several mechanisms have been proposed to explain the link between traumatic stress and MS, more work is needed in this area, they added.

This knowledge gap was acknowledged by Dr. Marrie.

“Our understanding of the etiology of MS remains incomplete,” Dr. Marrie said. “We still need a better understanding of mechanisms by which adverse childhood experiences lead to MS, how they interact with other risk factors for MS (beyond smoking and obesity), and whether there are any interventions that can mitigate the risk of developing MS that is associated with adverse childhood experiences.”

The investigators disclosed relationships with Novartis, Biogen, Merck, and others. Dr. Marrie receives research support from the Canadian Institutes of Health Research, the National Multiple Sclerosis Society, MS Society of Canada, the Consortium of Multiple Sclerosis Centers, Crohn’s and Colitis Canada, Research Manitoba, and the Arthritis Society; she has no pharmaceutical support. Dr. Braley and Ms. Polick reported no conflicts of interest.

Issue
Neurology Reviews - 30(6)


FROM THE JOURNAL OF NEUROLOGY, NEUROSURGERY, & PSYCHIATRY

Publish date: April 20, 2022

Real-world data suggest coprescribing PDE5 inhibitors and nitrates may be safe

Article Type
Changed

Although coprescribing of drugs for erectile dysfunction and oral organic nitrates for ischemic heart disease (IHD) has surged, cardiovascular adverse events have not significantly increased, a new study finds.

The authors of the new research specifically examined how frequently phosphodiesterase type 5 (PDE5) inhibitors, such as Viagra, were prescribed. The U.S. Food and Drug Administration and the European Medicines Agency have warned that these drugs for erectile dysfunction are contraindicated for use with nitrates because of concerns about cardiovascular risks.

“Small, randomized, pharmacologic studies have reported an amplified decrease in blood pressure during controlled coexposure with nitrates and [phosphodiesterase type 5 inhibitors], both in healthy participants and in participants with IHD,” wrote lead author Anders Holt, MD, of Copenhagen University Hospital–Herlev and Gentofte and colleagues, in Annals of Internal Medicine. “Potentially, this increases the risk for vascular ischemic events including myocardial infarction and stroke.”

But real-world data showing that using both types of drugs together increases these risks are scarce, the researchers noted.

To address this knowledge gap, Dr. Holt and colleagues conducted a retrospective study involving 249,541 Danish men with IHD. In this overall population, from 2000 to 2018, prescriptions for PDE5 inhibitors increased 10-fold, from 3.1 to 30.9 prescriptions per 100 persons per year. Within a subgroup of 42,073 patients continuously prescribed oral organic nitrates, PDE5-inhibitor prescriptions jumped twice that magnitude, from 0.9 to 19.7 prescriptions per 100 persons per year.

Despite this surge in coprescribing, the investigators did not observe a significant increase in either of two composite measures of cardiovascular adverse events. The first composite included ischemic stroke, shock, cardiac arrest, myocardial infarction, or acute coronary arteriography (odds ratio, 0.58; 95% confidence interval, 0.28-1.13). The second composite included drug-related adverse events, angina pectoris, or syncope (OR, 0.73; CI, 0.40-1.32).
 

Lead author speculates on reasons for findings

“I propose several explanations [for these findings],” Dr. Holt said in an interview, “but I want to emphasize that our study does not contain any data to back it up. It is just speculation. First, the observed drop in blood pressure may not cause a condition for which patients seek a hospital. A drop in blood pressure has been shown in pharmacologic trials, but it might not translate to a real-life risk for cardiovascular outcomes. Second, patients could be well informed and adherent to guidance that the prescribing physician has provided. For example, patients are aware of the recommended pause in nitrate treatment before PDE5-inhibitor use and follow these recommendations. Third, nitrates are often taken in the morning, and with the careful assumption that most PDE5-inhibitor activities take place in the evening, the nitrates could be metabolized to a degree such that the synergistic interaction is negligible.”

Dr. Holt went on to suggest a novel clinical approach based on the new findings.

“Coadministration should still be contraindicated due to the proven drop in blood pressure,” he said. “However, perhaps physicians can allow for coprescription if patients are adequately informed.”

A qualitative study is needed to determine how patients and physicians discuss coprescription, including avoidance of coadministration, Dr. Holt added.
 

 

 

Findings call for a reassessment of whether the contraindication is warranted

Robert A. Kloner, MD, PhD, chief science officer at the Huntington Medical Research Institutes in Pasadena, Calif., and professor of medicine at University of Southern California, Los Angeles, previously conducted research exploring drug interactions with PDE5 inhibitors, and in 2018, coauthored a literature review that concluded that PDE5 inhibitors and nitrates are contraindicated.

But now, considering these new findings, Dr. Kloner is offering a fresh perspective.

“This study is reassuring,” Dr. Kloner said in an interview. “I think that it’s time to reassess whether there should be an absolute contraindication, or this should be more of like a warning.”

He noted that in controlled studies, like the ones he previously conducted, PDE5 inhibitors and nitrates were administered “very close to each other, on purpose,” yet this probably doesn’t reflect typical practice, in which clinicians can guide usage based on durations of drug metabolism.

“I think that physicians might be more comfortable now prescribing the drugs at the same time, but then telling patients that they shouldn’t take the two drugs simultaneously; they should wait and take the nitrate 24 hours after the last Viagra, or the nitrate 48 hours after the last Cialis,” Dr. Kloner said. “I suspect that that is happening. I suspect also the fact that people would be more likely to take the nitrate in the morning and the PDE5 inhibitor at night probably also contributes to the safety findings.”

Dr. Kloner noted that blood pressures vary throughout the day based on circadian rhythm, and that the body can adapt to some fluctuations without negative effects.

There could still be some people who experience a drop in blood pressure and get sick from it from the two drugs interacting, but that’s probably not that common, he said.

The study was supported by several grants. The investigators disclosed relationships with Merck, BMS, Bayer, and others. Dr. Kloner consults for Sanofi.

Publications
Topics
Sections

As coprescribing drugs for erectile dysfunction and oral organic nitrates for ischemic heart disease (IHD) surged, cardiovascular adverse events did not significantly increase, a new study finds.

The authors of the new research specifically examined how frequently phosphodiesterase type 5 (PDE5) inhibitors, such as Viagra, were prescribed. The U.S. Food and Drug Administration and the European Medicines Agency have warned that these drugs for erectile dysfunction are contraindicated for use with nitrates because of concerns about cardiovascular risks.

“Small, randomized, pharmacologic studies have reported an amplified decrease in blood pressure during controlled coexposure with nitrates and [phosphodiesterase type 5 inhibitors], both in healthy participants and in participants with IHD,” wrote lead author Anders Holt, MD, of Copenhagen University Hospital–Herlev and Gentofte and colleagues, in Annals of Internal Medicine. “Potentially, this increases the risk for vascular ischemic events including myocardial infarction and stroke.”

But there is a scarcity of real-world data showing that using both types of drugs together increases these risks, the researchers noted.

To address this knowledge gap, Dr. Holt and colleagues conducted a retrospective study involving 249,541 Danish men with IHD. In this overall population, from 2000 to 2018, prescriptions for PDE5 inhibitors increased 10-fold, from 3.1 to 30.9 prescriptions per 100 persons per year. Within a subgroup of 42,073 patients continuously prescribed oral organic nitrates, PDE5-inhibitor prescriptions climbed even more steeply, roughly 20-fold, from 0.9 to 19.7 prescriptions per 100 persons per year.

Despite this surge in coprescribing, the investigators did not observe a significant increase in either of two composite measures of cardiovascular adverse events. The first composite included ischemic stroke, shock, cardiac arrest, myocardial infarction, or acute coronary arteriography (odds ratio, 0.58; 95% confidence interval, 0.28-1.13). The second composite included drug-related adverse events, angina pectoris, or syncope (OR, 0.73; 95% CI, 0.40-1.32).

Lead author speculates on reasons for findings

“I propose several explanations [for these findings],” Dr. Holt said in an interview, “but I want to emphasize that our study does not contain any data to back it up. It is just speculation. First, the observed drop in blood pressure may not cause a condition for which patients seek a hospital. A drop in blood pressure has been shown in pharmacologic trials, but it might not translate to a real-life risk for cardiovascular outcomes. Second, patients could be well informed and adherent to guidance that the prescribing physician has provided. For example, patients are aware of the recommended pause in nitrate treatment before PDE5-inhibitor use and follow these recommendations. Third, nitrates are often taken in the morning, and with the careful assumption that most PDE5-inhibitor activities take place in the evening, the nitrates could be metabolized to a degree such that the synergistic interaction is negligible.”

Dr. Holt went on to suggest a novel clinical approach based on the new findings.

“Coadministration should still be contraindicated due to the proven drop in blood pressure,” he said. “However, perhaps physicians can allow for coprescription if patients are adequately informed.”

A qualitative study is needed to determine how patients and physicians discuss coprescription, including avoidance of coadministration, Dr. Holt added.

Findings call for a reassessment of whether the contraindication is warranted

Robert A. Kloner, MD, PhD, chief science officer at the Huntington Medical Research Institutes in Pasadena, Calif., and professor of medicine at University of Southern California, Los Angeles, previously conducted research exploring drug interactions with PDE5 inhibitors, and in 2018, coauthored a literature review that concluded that PDE5 inhibitors and nitrates are contraindicated.

But now, considering these new findings, Dr. Kloner is offering a fresh perspective.

“This study is reassuring,” Dr. Kloner said in an interview. “I think that it’s time to reassess whether there should be an absolute contraindication, or this should be more of like a warning.”

He noted that in controlled studies, like the ones he previously conducted, PDE5 inhibitors and nitrates were administered “very close to each other, on purpose,” yet this probably doesn’t reflect typical practice, in which clinicians can guide usage based on durations of drug metabolism.

“I think that physicians might be more comfortable now prescribing the drugs at the same time, but then telling patients that they shouldn’t take the two drugs simultaneously; they should wait and take the nitrate 24 hours after the last Viagra, or the nitrate 48 hours after the last Cialis,” Dr. Kloner said. “I suspect that that is happening. I suspect also the fact that people would be more likely to take the nitrate in the morning and the PDE5 inhibitor at night probably also contributes to the safety findings.”

Dr. Kloner noted that blood pressures vary throughout the day based on circadian rhythm, and that the body can adapt to some fluctuations without negative effects.

There could still be some people who experience a drop in blood pressure from the two drugs interacting and become ill as a result, but that’s probably not that common, he said.

The study was supported by several grants. The investigators disclosed relationships with Merck, BMS, Bayer, and others. Dr. Kloner consults for Sanofi.



FROM ANNALS OF INTERNAL MEDICINE


Study: Physical fitness in children linked with concentration, quality of life


Physically fit children have a greater ability to concentrate and better health-related quality of life (HRQOL), according to a new study.

The findings of the German study involving more than 6,500 kids emphasize the importance of cardiorespiratory health in childhood, and support physical fitness initiatives in schools, according to lead author Katharina Köble, MSc, of the Technical University of Munich (Germany), and colleagues.

“Recent studies show that only a few children meet the recommendations of physical activity,” the investigators wrote in Journal of Clinical Medicine.

While the health benefits of physical activity are clearly documented, Ms. Köble and colleagues noted that typical measures of activity, such as accelerometers or self-reported questionnaires, are suboptimal research tools.

“Physical fitness is a more objective parameter to quantify when evaluating health promotion,” the investigators wrote. “Furthermore, cardiorespiratory fitness as part of physical fitness is more strongly related to risk factors of cardiovascular disease than physical activity.”

According to the investigators, physical fitness has also been linked with better concentration and HRQOL, but never in the same population of children.

The new study aimed to address this knowledge gap by assessing 6,533 healthy children aged 6-10 years, approximately half boys and half girls. Associations between physical fitness, concentration, and HRQOL were evaluated using multiple linear regression analysis in participants aged 9-10 years.

Physical fitness was measured using a series of challenges, including curl-ups (abdominal crunches), push-ups, the standing long jump, handgrip strength measurement, and the Progressive Aerobic Cardiovascular Endurance Run (PACER). The PACER, a multistage shuttle run, “requires participants to maintain the pace set by an audio signal, which progressively increases the intensity every minute.” Results of the PACER test were used to estimate VO2max.

Concentration was measured using the d2-R test, “a paper-pencil cancellation test, where subjects have to cross out all ‘d’ letters with two dashes under a time limit.”

HRQOL was evaluated with the KINDL questionnaire, which covers emotional well-being, physical well-being, everyday functioning (school), friends, family, and self-esteem.

Analysis showed that physical fitness improved with age (P < .001), except for VO2max in girls (P = .129). Concentration also improved with age (P < .001), while HRQOL did not (P = .179).

Among children aged 9-10 years, VO2max scores were strongly associated with both HRQOL (P < .001) and concentration (P < .001).

“VO2max was found to be one of the main factors influencing concentration levels and HRQOL dimensions in primary school children,” the investigators wrote. “Physical fitness, especially cardiorespiratory performance, should therefore be promoted more specifically in school settings to support the promotion of an overall healthy lifestyle in children and adolescents.”

Findings are having a real-world impact, according to researcher

In an interview, Ms. Köble noted that the findings are already having a real-world impact.

“We continued data assessment in the long-term and specifically adapted prevention programs in school to the needs of the school children we identified in our study,” she said. “Schools are partially offering specific movement and nutrition classes now.”

In addition, Ms. Köble and colleagues plan on educating teachers about the “urgent need for sufficient physical activity.”

“Academic performance should be considered as an additional health factor in future studies, as well as screen time and eating patterns, as all those variables showed interactions with physical fitness and concentration. In a subanalysis, we showed that children with better physical fitness and concentration values were those who usually went to higher education secondary schools,” they wrote.

VO2max did not correlate with BMI

Gregory Weaver, MD, a pediatrician at Cleveland Clinic Children’s, voiced some concerns about the reliability of the findings. He noted that VO2max did not correlate with body mass index or other measures of physical fitness, and that using the PACER test to estimate VO2max may have skewed the association between physical fitness and concentration.

“It is quite conceivable that children who can maintain the focus to perform maximally on this test will also do well on other tests of attention/concentration,” Dr. Weaver said. “Most children I know would have a very difficult time performing a physical fitness test which requires them to match a recorded pace that slowly increases over time. I’m not an expert in the area, but it is my understanding that usually VO2max tests involve a treadmill, which allows investigators to have complete control over pace.”

Dr. Weaver concluded that more work is needed to determine if physical fitness interventions can have a positive impact on HRQOL and concentration.

“I think the authors of this study attempted to ask an important question about the possible association between physical fitness and concentration among school aged children,” Dr. Weaver said in an interview. “But what is even more vital are studies demonstrating that a change in modifiable health factors like nutrition, physical fitness, or the built environment can improve quality of life. I was hoping the authors would show that an improvement in VO2max over time resulted in an improvement in concentration. Frustratingly, that is not what this article demonstrates.”

The investigators and Dr. Weaver reported no conflicts of interest.



FROM THE JOURNAL OF CLINICAL MEDICINE


New HBV model may open door to more effective antivirals

Long-sought-after breakthrough?

A new mouse model that better represents chronic infection with hepatitis B virus (HBV) in humans may lead to more effective antiviral therapies for HBV, according to investigators.

During human infection, HBV genomes take the form of covalently closed circular DNA (cccDNA), a structure that has thwarted effective antiviral therapy and, until now, creation of an accurate mouse model, reported lead author Zaichao Xu, PhD, of Wuhan (China) University and colleagues.

“As the viral persistence reservoir plays a central role in HBV infection, HBV cccDNA is the key obstacle for a cure,” the investigators wrote in Cellular and Molecular Gastroenterology and Hepatology.

Although several previous mouse models have approximated this phenomenon with recombinant cccDNA-like molecules (rcccDNA), the present model is the first to achieve genuine cccDNA, which does not naturally occur in mice.

“Although rcccDNA supports persistent viral replication and antigen expression, the nature of rcccDNA may differ from authentic cccDNA, as additional sequences, like LoxP or attR, were inserted into the HBV genome,” the investigators noted.

The new model was created by first constructing an adeno-associated virus vector carrying a replication-deficient HBV1.04-fold genome (AAV-HBV1.04). When injected into mice, the vector led to cccDNA formation via ataxia-telangiectasia and Rad3-related protein (ATR)–mediated DNA damage response, a finding that was confirmed by blocking the same process with ATR inhibitors.

Immediately after injection, mice tested positive for both hepatitis B e antigen (HBeAg) and hepatitis B surface antigen (HBsAg), with peak concentrations after either 4 or 8 weeks depending on dose. HBV DNA was also detected in serum after injection, and 50% of hepatocytes exhibited HBsAg and hepatitis B core protein (HBc) after 1 week. At week 66, HBsAg, HBeAg, and HBc were still detectable in the liver.

“The expression of HBc could only be observed in the liver, but not in other organs or tissues, suggesting that the AAV-HBV1.04 only targeted the mouse liver,” the investigators wrote.

Further experimentation involving known cccDNA-binding proteins supported the similarity between cccDNA in the mouse model and natural infection.

“These results suggested that the chromatinization and transcriptional activation of cccDNA formed in this model does not differ from wild-type cccDNA formed through infection,” they wrote.

Next, Dr. Xu and colleagues demonstrated that the infected mice could serve as a reliable model for antiviral research. One week after injection with the vector, mice were treated with entecavir, polyinosinic-polycytidylic acid (poly[I:C]), or phosphate-buffered saline (PBS; control). As anticipated, entecavir suppressed circulating HBV DNA, but not HBsAg, HBeAg, or HBV cccDNA, whereas treatment with poly(I:C) reduced all HBV markers.

“This novel mouse model will provide a unique platform for studying HBV cccDNA and developing novel antivirals to achieve HBV cure,” the investigators concluded.

The study was supported by the National Natural Science Foundation of China, the Fundamental Research Funds for the Central Universities, Hubei Province’s Outstanding Medical Academic Leader Program, and others. The investigators reported no conflicts of interest.

Body

 

On the heels of the wondrous development of curative antiviral agents for hepatitis C virus (HCV), renewed attention has been directed to efforts to bring about the cure of HBV. However, this task will hinge on successful elimination of covalently closed circular DNA (cccDNA), a highly stable form of viral DNA that is exceedingly difficult to eliminate. Efforts to develop successful curative strategies will in turn rely on development of small animal models that support HBV cccDNA formation and virus production, which has until recently proved elusive. In the past several years, several mouse HBV models supporting cccDNA formation have been constructed using adeno-associated vector (AAV)–mediated transduction of a linearized HBV genome. Both the AAV-HBV linear episome and cccDNA have been consistently replicated and detected in these models. While they recapitulate the key steps of the viral life cycle, these models do not, however, lend themselves to direct assessment of cccDNA, which has traditionally required detection in the liver.

Dr. Raymond T. Chung
Xu et al. have now developed a novel mouse model in which generation of HBsAg is directly dependent on generation of cccDNA. This dependence thus yields a simple marker for assessment of cccDNA status and allows monitoring of the therapeutic effects of novel agents targeting cccDNA by simply following HBsAg titers. More studies are required to explore the mechanisms underlying HBV cccDNA formation and elimination, but this work suggests a new way forward to tractably evaluate agents that specifically interrupt cccDNA metabolism, an important step in our systematic march toward HBV cure.
 

Raymond T. Chung, MD, is a professor of medicine at Harvard Medical School and director of the Hepatology and Liver Center at Massachusetts General Hospital, both in Boston. He has no conflicts to disclose.




FROM CELLULAR AND MOLECULAR GASTROENTEROLOGY AND HEPATOLOGY


Bowel ultrasound may overtake colonoscopy in Crohn’s

A 'significant financial burden' avoided

Bowel ultrasound predicts the clinical course of Crohn’s disease for up to 1 year, according to results of a prospective trial involving 225 patients.

After additional confirmation in larger studies, ultrasound could serve as a noninvasive alternative to colonoscopy for monitoring and predicting disease course, reported lead author Mariangela Allocca, MD, PhD, of Humanitas University, Milan, and colleagues.

“Frequent colonoscopies are expensive, invasive, and not well tolerated by patients, thus noninvasive tools for assessment and monitoring are strongly needed,” the investigators wrote in Clinical Gastroenterology and Hepatology. “Bowel ultrasound accurately detects inflammatory bowel disease activity, extent, and complications, particularly in Crohn’s disease. Considering its low cost, minimal invasiveness ... and easy repeatability, bowel ultrasound may be a simple, readily available tool for assessing and monitoring Crohn’s disease.”

To test this hypothesis, Dr. Allocca and colleagues enrolled 225 consecutive patients with ileal and/or colonic Crohn’s disease diagnosed for at least 6 months and managed at a tertiary hospital in Italy. All patients underwent both colonoscopy and bowel ultrasound with no more than 3 months between each procedure.

Colonoscopy results were characterized by the Simplified Endoscopic Score for Crohn’s disease (SES-CD), whereas ultrasound was scored using several parameters, including bowel wall pattern, bowel wall thickness, bowel wall flow, presence of complications (abscess, fistula, stricture), and characteristics of mesenteric lymph nodes and tissue. Ultrasound scores were considered high if they exceeded a cutoff of 3.52, which was determined by receiver operating characteristic curve analysis.
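A cutoff like 3.52 is typically read off the ROC curve by maximizing Youden’s J (sensitivity + specificity − 1) across candidate thresholds; the sketch below illustrates that standard approach with invented data, not the authors’ actual method or cohort:

```python
def best_cutoff(scores, labels):
    """Pick the score threshold maximizing Youden's J.
    labels: 1 = negative disease course, 0 = benign course."""
    best_j, best_t = -1.0, None
    for t in sorted(set(scores)):
        tp = sum(1 for s, y in zip(scores, labels) if s > t and y == 1)
        fn = sum(1 for s, y in zip(scores, labels) if s <= t and y == 1)
        tn = sum(1 for s, y in zip(scores, labels) if s <= t and y == 0)
        fp = sum(1 for s, y in zip(scores, labels) if s > t and y == 0)
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        j = sens + spec - 1
        if j > best_j:
            best_j, best_t = j, t
    return best_t, best_j

# Invented ultrasound scores and 12-month outcomes for illustration:
scores = [1.2, 2.0, 3.1, 3.5, 4.0, 4.8, 5.5, 2.5]
labels = [0,   0,   0,   1,   1,   1,   1,   0]
t, j = best_cutoff(scores, labels)
print(t, j)  # prints 3.1 1.0
```

In this toy data the classes separate perfectly at 3.1; with real patients the maximal J is well below 1, and the chosen cutoff trades sensitivity against specificity.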

Participants were followed for 12 months after baseline ultrasound. The primary objective was to determine the relationship between baseline ultrasound findings and negative disease course, defined by steroid usage, need for surgery, need for hospitalization, and/or change in therapy. The secondary objective was to understand the relationship between ultrasound findings and endoscopy activity.

Multivariable analysis revealed that ultrasound scores greater than 3.52 predicted a negative clinical disease course for up to 1 year (odds ratio, 6.97; 95% confidence interval, 2.87-16.93; P < .001), as did the presence of at least one disease complication at baseline (OR, 3.90; 95% CI, 1.21-12.53; P = .021). A worse clinical course at 1 year was also predicted by a baseline fecal calprotectin value of at least 250 mcg/g (OR, 5.43; 95% CI, 2.25-13.11; P < .001) and male sex (OR, 2.60; 95% CI, 1.12-6.02; P = .025).
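Odds ratios like these are exponentiated logistic-regression coefficients, with the 95% CI obtained as exp(beta ± 1.96·SE). As an arithmetic check only, the beta and SE below are back-calculated from the published interval for the ultrasound score, not values reported in the paper:

```python
import math

def odds_ratio_ci(beta: float, se: float, z: float = 1.96):
    """Odds ratio and 95% CI from a logistic-regression
    coefficient (beta) and its standard error (se)."""
    return math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se)

# beta and se back-calculated from the reported OR 6.97 (95% CI, 2.87-16.93):
or_, lo, hi = odds_ratio_ci(beta=1.9416, se=0.4527)
print(round(or_, 2), round(lo, 2), round(hi, 2))  # ≈ 6.97 2.87 16.93
```

Because the CI is symmetric on the log scale, an interval that excludes 1 (as all four here do) corresponds to P < .05 for that predictor.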

Investigators then assessed the relationship between baseline results and individual disease outcomes at 12 months. For example, a high ultrasound score and high fecal calprotectin at baseline each predicted the need for treatment escalation. In comparison, disease behavior (inflammatory, stricturing, or penetrating) and C-reactive protein (CRP) predicted the need for corticosteroids. The only significant predictor of hospitalization a year later was CRP.

“[B]owel ultrasound is able to predict disease course in Crohn’s disease patients,” they wrote. “It may identify patients at high risk of a negative course to adopt effective strategies to prevent any disease progression. Our data need to be confirmed and validated in further large studies.”

The investigators disclosed relationships with Janssen, AbbVie, Mundipharma, and others.

Body

Patients with Crohn’s disease (CD) undergo multiple colonoscopies during their lifetime. Endoscopic assessment is often necessary to determine the extent and severity of inflammation to guide choice of therapy, to assess mucosal healing on current therapy, and for surveillance examination for colorectal dysplasia. Multiple colonoscopies over a lifetime present a significant financial burden for patients. The invasive nature of the procedure, along with the small but real risk of perforation and patient discomfort, makes for an undesirable experience. Cross-sectional imaging offers the advantage of a noninvasive modality to assess the bowel wall and extraluminal complications related to CD. Bowel ultrasound, performed as point-of-care imaging by gastroenterologists, is an emerging imaging alternative to visualize the bowel.

In the study by Allocca et al., the authors developed a bowel ultrasound–based score incorporating bowel wall thickness, pattern, flow, and presence of extraluminal complications. The score was developed by comparing ultrasound parameters with colonoscopy findings for each segment of the colon and terminal ileum. In a cohort of 225 patients, a bowel ultrasound score of >3.52 along with at least one extraluminal complication, baseline fecal calprotectin of >250 mcg/g, and male gender were linked with adverse outcomes within 12 months (defined as need for steroids, change of therapy, hospitalization, or surgery).

Dr. Manreet Kaur

While these observations need to be validated externally, this study further consolidates the role of bowel ultrasound as a viable imaging modality to monitor disease and response to therapy in CD. Prior studies have shown bowel ultrasound is a valid alternative to MR enterography, without the expense, limited availability, and need for gadolinium contrast. As the therapeutic targets in IBD move toward mucosal healing, bowel ultrasound offers the promise of a cost-effective, noninvasive, point-of-care test that can be performed during an office consultation. The operator-dependent nature of this modality may limit its uptake and utilization. The International Bowel Ultrasound Group (IBUS) has collaborated with the European Crohn’s and Colitis Organisation as well as the Canadian Association of Gastroenterology to establish training and research in bowel ultrasound. Soon, patients can expect a bowel ultrasound to become part of their routine assessment during an office consultation.

Manreet Kaur, MD, is medical director of the Inflammatory Bowel Disease Center and an associate professor in the division of gastroenterology and hepatology at Baylor College of Medicine, Houston. She has no relevant conflicts of interest.
 


Bowel ultrasound predicts the clinical course of Crohn’s disease for up to 1 year, according to results of a prospective trial involving 225 patients.

After additional confirmation in larger studies, ultrasound could serve as a noninvasive alternative to colonoscopy for monitoring and predicting disease course, reported lead author Mariangela Allocca, MD, PhD, of Humanitas University, Milan, and colleagues.

“Frequent colonoscopies are expensive, invasive, and not well tolerated by patients, thus noninvasive tools for assessment and monitoring are strongly needed,” the investigators wrote in Clinical Gastroenterology and Hepatology. “Bowel ultrasound accurately detects inflammatory bowel disease activity, extent, and complications, particularly in Crohn’s disease. Considering its low cost, minimal invasiveness ... and easy repeatability, bowel ultrasound may be a simple, readily available tool for assessing and monitoring Crohn’s disease.”

To test this hypothesis, Dr. Allocca and colleagues enrolled 225 consecutive patients with ileal and/or colonic Crohn’s disease diagnosed for at least 6 months and managed at a tertiary hospital in Italy. All patients underwent both colonoscopy and bowel ultrasound with no more than 3 months between each procedure.

Colonoscopy results were characterized by the Simplified Endoscopic Score for Crohn’s disease (SES-CD), whereas ultrasound was scored using a several parameters, including bowel wall pattern, bowel thickness, bowel wall flow, presence of complications (abscess, fistula, stricture), and characteristics of mesenteric lymph nodes and tissue. Ultrasound scores were considered high if they exceeded a cut-off of 3.52, which was determined by a receiver operating characteristic curve analysis.

Participants were followed for 12 months after baseline ultrasound. The primary objective was to determine the relationship between baseline ultrasound findings and negative disease course, defined by steroid usage, need for surgery, need for hospitalization, and/or change in therapy. The secondary objective was to understand the relationship between ultrasound findings and endoscopy activity.

Multivariable analysis revealed that ultrasound scores greater than 3.52 predicted a negative clinical disease course for up to one year (odds ratio, 6.97; 95% confidence interval, 2.87-16.93; P < .001), as did the presence of at least one disease complication at baseline (OR, 3.90; 95% CI, 1.21-12.53; P = 0.21). A worse clinical course at one-year was also predicted by a baseline fecal calprotectin value of at least 250 mcg/g (OR, 5.43; 95% CI, 2.25-13.11; P < .001) and male sex (OR, 2.60; 95% CI, 1.12-6.02; P = .025).

Investigators then assessed individual disease outcomes at 12 months and baseline results. For example, high ultrasound score and calprotectin at baseline each predicted the need for treatment escalation. In comparison, disease behavior (inflammatory, stricturing, penetrating) and C reactive protein predicted need for corticosteroids. The only significant predictor of hospitalization a year later was CRP.

“[B]owel ultrasound is able to predict disease course in Crohn’s disease patients,” they wrote. “It may identify patients at high risk of a negative course to adopt effective strategies to prevent any disease progression. Our data need to be confirmed and validated in further large studies.”

The investigators disclosed relationships with Janssen, AbbVie, Mundipharma, and others.

Bowel ultrasound predicts the clinical course of Crohn’s disease for up to 1 year, according to results of a prospective trial involving 225 patients.

After additional confirmation in larger studies, ultrasound could serve as a noninvasive alternative to colonoscopy for monitoring and predicting disease course, reported lead author Mariangela Allocca, MD, PhD, of Humanitas University, Milan, and colleagues.

“Frequent colonoscopies are expensive, invasive, and not well tolerated by patients, thus noninvasive tools for assessment and monitoring are strongly needed,” the investigators wrote in Clinical Gastroenterology and Hepatology. “Bowel ultrasound accurately detects inflammatory bowel disease activity, extent, and complications, particularly in Crohn’s disease. Considering its low cost, minimal invasiveness ... and easy repeatability, bowel ultrasound may be a simple, readily available tool for assessing and monitoring Crohn’s disease.”

To test this hypothesis, Dr. Allocca and colleagues enrolled 225 consecutive patients with ileal and/or colonic Crohn’s disease, diagnosed at least 6 months earlier, who were managed at a tertiary hospital in Italy. All patients underwent both colonoscopy and bowel ultrasound, with no more than 3 months between the two procedures.

Colonoscopy results were characterized by the Simplified Endoscopic Score for Crohn’s disease (SES-CD), whereas ultrasound was scored using several parameters, including bowel wall pattern, bowel wall thickness, bowel wall flow, presence of complications (abscess, fistula, stricture), and characteristics of mesenteric lymph nodes and tissue. Ultrasound scores were considered high if they exceeded a cutoff of 3.52, which was determined by receiver operating characteristic curve analysis.

Participants were followed for 12 months after baseline ultrasound. The primary objective was to determine the relationship between baseline ultrasound findings and negative disease course, defined by steroid usage, need for surgery, need for hospitalization, and/or change in therapy. The secondary objective was to understand the relationship between ultrasound findings and endoscopic activity.

Multivariable analysis revealed that ultrasound scores greater than 3.52 predicted a negative clinical disease course for up to 1 year (odds ratio, 6.97; 95% confidence interval, 2.87-16.93; P < .001), as did the presence of at least one disease complication at baseline (OR, 3.90; 95% CI, 1.21-12.53; P = .021). A worse clinical course at 1 year was also predicted by a baseline fecal calprotectin value of at least 250 mcg/g (OR, 5.43; 95% CI, 2.25-13.11; P < .001) and male sex (OR, 2.60; 95% CI, 1.12-6.02; P = .025).

The investigators then assessed relationships between baseline findings and individual disease outcomes at 12 months. For example, high ultrasound score and high fecal calprotectin at baseline each predicted the need for treatment escalation. In comparison, disease behavior (inflammatory, stricturing, penetrating) and C-reactive protein (CRP) predicted the need for corticosteroids. The only significant baseline predictor of hospitalization a year later was CRP.

“[B]owel ultrasound is able to predict disease course in Crohn’s disease patients,” they wrote. “It may identify patients at high risk of a negative course to adopt effective strategies to prevent any disease progression. Our data need to be confirmed and validated in further large studies.”

The investigators disclosed relationships with Janssen, AbbVie, Mundipharma, and others.

FROM CLINICAL GASTROENTEROLOGY AND HEPATOLOGY

In-hospital detox or not, anti-CGRPs show efficacy for medication overuse headache

Anti–calcitonin gene-related peptide (anti-CGRP) monoclonal antibodies are effective for patients with chronic migraine and medication overuse headache regardless of detoxification strategy, according to investigators.

Abruptly discontinuing overused analgesics with health care provider oversight – a frequently resource-intensive and challenging process – is no more effective for controlling medication overuse headache than simply advising patients to stop, reported lead author Umberto Pensato, MD, of the University of Bologna, Italy, and colleagues.

“[C]urrently, the abrupt discontinuation of the overused painkiller(s), accompanied by the start of a pharmacological preventive therapy, is the most recommended strategy [for medication overuse headache],” the investigators wrote in Cephalalgia. “While painkiller(s) withdrawal could be accomplished on an outpatient basis in most cases, an in-hospital setting may be required to achieve successful discontinuation in a subgroup of patients with medication overuse headache, further weighing on individual and hospital costs. Additionally hampering this approach, the abrupt discontinuation of the overused painkiller(s) invariably results in disabling withdrawal symptoms for up to 2 weeks, including a transitory worsening of headache, the so-called ‘rebound headache.’ ”
 

Inpatient or outpatient: Does it matter?

According to Dr. Pensato and colleagues, early evidence suggests that previous painkiller withdrawal does not impact the efficacy of anti-CGRPs for medication overuse headache, yet relevant data remain scarce. To address this knowledge gap, they conducted a prospective, real-world study exploring the relationship between detoxification and outcomes after starting anti-CGRP therapy.

Out of 401 patients enrolled based on initiation of erenumab or galcanezumab, 111 satisfied inclusion criteria, including diagnosis of chronic migraine and medication overuse headache, at least 28 days of analgesic usage and headache days per month in the preceding 3 months, and other factors. Of these 111 patients, 83 underwent in-hospital detox, while the remaining 28 patients, who declined detox for personal reasons or because of COVID-19–related bed shortages, were advised to discontinue overused medication on an outpatient basis (without oversight).

The primary endpoint was medication overuse headache responder rate after 3 months, as defined by ICHD-3 diagnostic criteria. Secondary endpoints included 6-item headache impact test (HIT-6), monthly headache days (MHD), migraine disability assessment score (MIDAS), mean pain intensity (MPI), monthly pain medication intake (MPMI), baseline predictors of response/refractoriness, and safety.

Three months after starting anti-CGRP therapy, 59% of patients had resolution of medication overuse headache, including 57% in the inpatient detox group and 64% in the outpatient group, a difference that was not statistically significant (P = .4788). Approximately half of the patients (51%) had at least 50% reduction in monthly headache days; although the rate was numerically lower in the inpatient group compared with the outpatient group, the difference was again not significant (51% vs. 54%; P = .8393).

“Our results support the emerging evidence that anti-CGRP drugs may be effective in these patients irrespective of the detoxification program,” the investigators concluded. “Further studies are needed to definitively confirm these results, potentially leading to a paradigm shift in the management of medication overuse headache.”
 

Abrupt or gradual detox?

According to Alan M. Rapoport, MD, clinical professor of neurology at the University of California, Los Angeles, and editor-in-chief of Neurology Reviews, the study was hampered by two major design limitations.

“The biggest problem I see is that the two groups were treated very differently for their detoxification,” Dr. Rapoport said. “One group was detoxified abruptly in the hospital, so the authors were sure that the patients were off acute-care medication before they started their preventives. The other group was advised to stop their medication on an outpatient basis. The issue is that we have no follow-up as to whether the outpatients did or did not abruptly detoxify. A bigger issue was that the two groups were not randomized so there are many other variables that may have come into consideration.”

Still, Dr. Rapoport, a past president of the International Headache Society (IHS), noted that the findings strengthen a growing body of evidence supporting the efficacy of monoclonal antibodies for medication overuse headache regardless of detoxification strategy. He cited a 2020 study by Carlsen and colleagues conducted at the Danish Headache Center in Copenhagen, which reported similar medication overuse headache outcomes across three randomized cohorts whether they received preventive therapy with detoxification, preventive therapy without detoxification, or detoxification followed 2 months later by preventive therapy.

“What I have noticed since we have had monoclonal antibodies in our armamentarium is that these drugs work very well even when the patient has not fully detoxified,” Dr. Rapoport said. “What I do with my patients is not teach them how to detoxify now, but simply educate them to take fewer acute care medications as their headaches get better from the monoclonal antibodies; they should try to take fewer acute care medications for milder, shorter headaches, and just let them go away on their own. Previous research suggests that even when a patient is not educated at all about medication overuse headache and the reason for detoxification, monoclonal antibodies still work in the presence of medication overuse headache, and improve it.”

The investigators disclosed relationships with Allergan, Novartis, Teva, and others. Dr. Rapoport is on the speakers bureau for AbbVie.

Issue
Neurology Reviews - 30(4)
FROM CEPHALALGIA

Publish date: February 25, 2022

ILAE offers first guide to treating depression in epilepsy

The International League Against Epilepsy (ILAE) has issued recommendations for treating depression in patients with epilepsy.

The new guidance highlights the high prevalence of depression among patients with epilepsy while offering the first systematic approach to treatment, reported lead author Marco Mula, MD, PhD, of Atkinson Morley Regional Neuroscience Centre at St George’s University Hospital, London, and colleagues.

“Despite evidence that depression represents a frequently encountered comorbidity [among patients with epilepsy], data on the treatment of depression in epilepsy [are] still limited and recommendations rely mostly on individual clinical experience and expertise,” the investigators wrote in Epilepsia.

Recommendations cover first-line treatment of unipolar depression in epilepsy without other psychiatric disorders.

For patients with mild depression, the guidance supports psychological intervention without pharmacologic therapy; however, if the patient wishes to use medication, has had a positive response to medication in the past, or nonpharmacologic treatments have previously failed or are unavailable, then selective serotonin reuptake inhibitors (SSRIs) should be considered first-choice therapy. For moderate to severe depression, SSRIs are the first choice, according to Dr. Mula and colleagues.

“It has to be acknowledged that there is considerable debate in the psychiatric literature about the treatment of mild depression in adults,” the investigators noted. “A patient-level meta-analysis pointed out that the magnitude of benefit of antidepressant medications compared with placebo increases with severity of depression symptoms and it may be minimal or nonexistent, on average, in patients with mild or moderate symptoms.”

If a patient does not respond to first-line therapy, then venlafaxine should be considered, according to the guidance. When a patient does respond to therapy, treatment should be continued for at least 6 months, and when residual symptoms persist, treatment should be continued until resolution.

“In people with depression it is established that around two-thirds of patients do not achieve full remission with first-line treatment,” Dr. Mula and colleagues wrote. “In people with epilepsy, current data show that up to 50% of patients do not achieve full remission from depression. For this reason, augmentation strategies are often needed. They should be adopted by psychiatrists, neuropsychiatrists, or mental health professionals familiar with such therapeutic strategies.”

Beyond these key recommendations, the guidance covers a range of additional topics, including other pharmacologic options, medication discontinuation strategies, electroconvulsive therapy, light therapy, exercise training, vagus nerve stimulation, and repetitive transcranial magnetic stimulation.
 

Useful advice that counters common misconceptions

According to Jacqueline A. French, MD, a professor at NYU Langone Medical Center, Dr. Mula and colleagues are “top notch,” and their recommendations “hit every nail on the head.”

Dr. French, chief medical officer of The Epilepsy Foundation, emphasized the importance of the publication, which addresses two common misconceptions within the medical community: First, that standard antidepressants are insufficient to treat depression in patients with epilepsy, and second, that antidepressants may trigger seizures.

“The first purpose [of the publication] is to say, yes, these antidepressants do work,” Dr. French said, “and no, they don’t worsen seizures, and you can use them safely, and they are appropriate to use.”

Dr. French explained that managing depression remains a practice gap among epileptologists and neurologists because it is a diagnosis that doesn’t traditionally fall into their purview, yet many patients with epilepsy forgo visiting their primary care providers, who more frequently diagnose and manage depression. Dr. French agreed with the guidance that epilepsy specialists should fill this gap.

“We need to at least be able to take people through their first antidepressant, even though we were not trained to be psychiatrists,” Dr. French said. “That’s part of the best care of our patients.”

Imad Najm, MD, director of the Charles Shor Epilepsy Center, Cleveland Clinic, said the recommendations are a step forward in the field, as they are supported by clinical data, instead of just clinical experience and expertise.

Still, Dr. Najm noted that more work is needed to stratify risk of depression in epilepsy and evaluate a possible causal relationship between epilepsy therapies and depression.

He went on to emphasize the scale of the issue at hand, and the stakes involved.

“Depression, anxiety, and psychosis affect a large number of patients with epilepsy,” Dr. Najm said. “Clinical screening and recognition of these comorbidities leads to the institution of treatment options and significant improvement in quality of life. Mental health professionals should be an integral part of any comprehensive epilepsy center.”

The investigators disclosed relationships with Eisai, UCB, Elsevier, and others. Dr. French is indirectly involved with multiple pharmaceutical companies developing epilepsy drugs through her role as director of The Epilepsy Study Consortium, a nonprofit organization. Dr. Najm reported no conflicts of interest.

Issue
Neurology Reviews - 30(3)
Publications
Topics
Sections

The International League Against Epilepsy (ILAE) has issued recommendations for treating depression in patients with epilepsy.

The new guidance highlights the high prevalence of depression among patients with epilepsy while offering the first systematic approach to treatment, reported lead author Marco Mula, MD, PhD, of Atkinson Morley Regional Neuroscience Centre at St George’s University Hospital, London, and colleagues.

“Despite evidence that depression represents a frequently encountered comorbidity [among patients with epilepsy], data on the treatment of depression in epilepsy [are] still limited and recommendations rely mostly on individual clinical experience and expertise,” the investigators wrote in Epilepsia.

Recommendations cover first-line treatment of unipolar depression in epilepsy without other psychiatric disorders.

For patients with mild depression, the guidance supports psychological intervention without pharmacologic therapy; however, if the patient wishes to use medication, has had a positive response to medication in the past, or nonpharmacologic treatments have previously failed or are unavailable, then SSRIs should be considered first-choice therapy. For moderate to severe depression, SSRIs are the first choice, according to Dr. Mula and colleagues.

“It has to be acknowledged that there is considerable debate in the psychiatric literature about the treatment of mild depression in adults,” the investigators noted. “A patient-level meta-analysis pointed out that the magnitude of benefit of antidepressant medications compared with placebo increases with severity of depression symptoms and it may be minimal or nonexistent, on average, in patients with mild or moderate symptoms.”

If a patient does not respond to first-line therapy, then venlafaxine should be considered, according to the guidance. When a patient does respond to therapy, treatment should be continued for at least 6 months, and when residual symptoms persist, treatment should be continued until resolution.

“In people with depression it is established that around two-thirds of patients do not achieve full remission with first-line treatment,” Dr. Mula and colleagues wrote. “In people with epilepsy, current data show that up to 50% of patients do not achieve full remission from depression. For this reason, augmentation strategies are often needed. They should be adopted by psychiatrists, neuropsychiatrists, or mental health professionals familiar with such therapeutic strategies.”

Beyond these key recommendations, the guidance covers a range of additional topics, including other pharmacologic options, medication discontinuation strategies, electroconvulsive therapy, light therapy, exercise training, vagus nerve stimulation, and repetitive transcranial magnetic stimulation.
 

Useful advice that counters common misconceptions

According to Jacqueline A. French, MD, a professor at NYU Langone Medical Center, Dr. Mula and colleagues are “top notch,” and their recommendations “hit every nail on the head.”

The International League Against Epilepsy (ILAE) has issued recommendations for treating depression in patients with epilepsy.

The new guidance highlights the high prevalence of depression among patients with epilepsy while offering the first systematic approach to treatment, reported lead author Marco Mula, MD, PhD, of Atkinson Morley Regional Neuroscience Centre at St George’s University Hospital, London, and colleagues.

“Despite evidence that depression represents a frequently encountered comorbidity [among patients with epilepsy], data on the treatment of depression in epilepsy [are] still limited and recommendations rely mostly on individual clinical experience and expertise,” the investigators wrote in Epilepsia.

Recommendations cover first-line treatment of unipolar depression in epilepsy without other psychiatric disorders.

For patients with mild depression, the guidance supports psychological intervention without pharmacologic therapy; however, if the patient wishes to use medication, has had a positive response to medication in the past, or nonpharmacologic treatments have previously failed or are unavailable, then selective serotonin reuptake inhibitors (SSRIs) should be considered first-choice therapy. For moderate to severe depression, SSRIs are the first choice, according to Dr. Mula and colleagues.

“It has to be acknowledged that there is considerable debate in the psychiatric literature about the treatment of mild depression in adults,” the investigators noted. “A patient-level meta-analysis pointed out that the magnitude of benefit of antidepressant medications compared with placebo increases with severity of depression symptoms and it may be minimal or nonexistent, on average, in patients with mild or moderate symptoms.”

If a patient does not respond to first-line therapy, then venlafaxine should be considered, according to the guidance. When a patient does respond to therapy, treatment should be continued for at least 6 months, and when residual symptoms persist, treatment should be continued until resolution.
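The first- and second-line choices described above can be sketched as a simple decision flow. This is a schematic illustration of the guidance as summarized in this article, not clinical software; the function names and boolean parameters are illustrative.

```python
def first_line_plan(severity: str,
                    prefers_medication: bool = False,
                    prior_response_to_medication: bool = False,
                    psychological_tx_failed_or_unavailable: bool = False) -> str:
    """Suggested first-line approach for unipolar depression in epilepsy
    (no other psychiatric comorbidity), per the ILAE guidance summary."""
    if severity == "mild":
        # Psychological intervention is preferred, unless the patient
        # prefers medication, responded to it before, or nonpharmacologic
        # options failed or are unavailable.
        if (prefers_medication or prior_response_to_medication
                or psychological_tx_failed_or_unavailable):
            return "SSRI"
        return "psychological intervention"
    if severity in ("moderate", "severe"):
        return "SSRI"
    raise ValueError(f"unknown severity: {severity!r}")


def second_line_plan() -> str:
    # If there is no response to the first-line SSRI, the guidance
    # suggests venlafaxine.
    return "venlafaxine"
```

For example, `first_line_plan("mild")` returns `"psychological intervention"`, while `first_line_plan("moderate")` returns `"SSRI"`.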

“In people with depression it is established that around two-thirds of patients do not achieve full remission with first-line treatment,” Dr. Mula and colleagues wrote. “In people with epilepsy, current data show that up to 50% of patients do not achieve full remission from depression. For this reason, augmentation strategies are often needed. They should be adopted by psychiatrists, neuropsychiatrists, or mental health professionals familiar with such therapeutic strategies.”

Beyond these key recommendations, the guidance covers a range of additional topics, including other pharmacologic options, medication discontinuation strategies, electroconvulsive therapy, light therapy, exercise training, vagus nerve stimulation, and repetitive transcranial magnetic stimulation.
Useful advice that counters common misconceptions

According to Jacqueline A. French, MD, a professor at NYU Langone Medical Center, Dr. Mula and colleagues are “top notch,” and their recommendations “hit every nail on the head.”

Dr. Jacqueline A. French

Dr. French, chief medical officer of The Epilepsy Foundation, emphasized the importance of the publication, which addresses two common misconceptions within the medical community: First, that standard antidepressants are insufficient to treat depression in patients with epilepsy, and second, that antidepressants may trigger seizures.

“The first purpose [of the publication] is to say, yes, these antidepressants do work,” Dr. French said, “and no, they don’t worsen seizures, and you can use them safely, and they are appropriate to use.”

Dr. French explained that managing depression remains a practice gap among epileptologists and neurologists because it is a diagnosis that doesn’t traditionally fall into their purview, yet many patients with epilepsy forgo visiting their primary care providers, who more frequently diagnose and manage depression. Dr. French agreed with the guidance that epilepsy specialists should fill this gap.

“We need to at least be able to take people through their first antidepressant, even though we were not trained to be psychiatrists,” Dr. French said. “That’s part of the best care of our patients.”

Imad Najm, MD, director of the Charles Shor Epilepsy Center, Cleveland Clinic, said the recommendations are a step forward in the field, as they are supported by clinical data, instead of just clinical experience and expertise.

Dr. Imad Najm

Still, Dr. Najm noted that more work is needed to stratify risk of depression in epilepsy and evaluate a possible causal relationship between epilepsy therapies and depression.

He went on to emphasize the scale of the issue at hand, and the stakes involved.

“Depression, anxiety, and psychosis affect a large number of patients with epilepsy,” Dr. Najm said. “Clinical screening and recognition of these comorbidities leads to the institution of treatment options and significant improvement in quality of life. Mental health professionals should be an integral part of any comprehensive epilepsy center.”

The investigators disclosed relationships with Eisai, UCB, Elsevier, and others. Dr. French is indirectly involved with multiple pharmaceutical companies developing epilepsy drugs through her role as director of The Epilepsy Study Consortium, a nonprofit organization. Dr. Najm reported no conflicts of interest.

Issue
Neurology Reviews - 30(3)

FROM EPILEPSIA

Publish date: February 14, 2022

Moderate-vigorous stepping seen to lower diabetes risk in older women


More steps per day, particularly at a higher intensity, may reduce the risk of type 2 diabetes in older women, based on a prospective cohort study.

The link between daily stepping and diabetes was not significantly modified by body mass index (BMI) or other common diabetes risk factors, suggesting that the relationship is highly generalizable, lead author Alexis C. Garduno, MPH, a PhD student at the University of California, San Diego, and colleagues reported.

“Physical activity is a key modifiable behavior for diabetes prevention and management,” the investigators wrote in Diabetes Care. “Many prevention studies have demonstrated that regular physical activity, along with improved diet, reduces the risk of diabetes in adults. ... To the best of our knowledge, there are few studies examining the association between objectively measured steps per day and incident diabetes in a community-based setting.”

To this end, the investigators analyzed data from 4,838 older, community-living women in the Objective Physical Activity and Cardiovascular Health Study. Upon enrollment, women were without physician-diagnosed diabetes and had a mean age of 78.9 years. For 1 week, participants wore ActiGraph GT3X+ accelerometers to measure steps per day, as well as step intensity, graded as light or moderate to vigorous.

The relationship between daily activity and diabetes was analyzed using three multivariate models: The first included race/ethnicity and age; the second also included family history of diabetes, education, physical functioning, self-rated health, smoking status, and alcohol consumption; and the third added BMI, “a potential mediator in the causal pathway between steps per day and diabetes,” the investigators wrote.

Participants took an average of 3,729 steps per day, divided roughly evenly between light and moderate to vigorous intensity.

After a median follow-up of 5.7 years, 8.1% of women developed diabetes. The least-adjusted model showed a 14% reduction in diabetes risk per 2,000 steps (hazard ratio, 0.86; 95% confidence interval, 0.80-0.92; P = .007), whereas the second model, adjusting for more confounding variables, showed a 12% reduction in diabetes risk per 2,000 steps (HR, 0.88; 95% CI, 0.78-1.00; P = .045).
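Because the hazard ratios are reported per 2,000-step increment and the underlying Cox model is multiplicative, the estimate compounds across larger step differences. A back-of-the-envelope sketch using the least-adjusted HR of 0.86 quoted above (illustrative only; it assumes the model's log-linear form holds across the step range):

```python
HR_PER_2000 = 0.86  # least-adjusted hazard ratio per 2,000 steps/day

def hr_for_extra_steps(extra_steps: float) -> float:
    """Implied hazard ratio for an additional `extra_steps` steps/day,
    compounding the per-2,000-step estimate multiplicatively."""
    return HR_PER_2000 ** (extra_steps / 2000)

print(round(hr_for_extra_steps(2000), 2))  # 0.86
print(round(hr_for_extra_steps(4000), 2))  # 0.74, i.e., ~26% lower hazard
```

So, under this model, 4,000 extra steps per day would correspond to roughly a 26% lower hazard (0.86 squared), not simply twice the 14% reduction.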

The final model, which added BMI, showed a 10% reduction in risk, although it didn’t reach statistical significance (HR, 0.90; 95% CI, 0.80-1.02; P = .11). Furthermore, accelerated failure time models suggested that BMI did not significantly impact the link between steps and diabetes (proportion mediated, 17.7%; 95% CI, –55.0 to 142.0; P = .09). Further analyses also found no significant interactions between BMI or other possible confounders.

“The steps per day–diabetes association was not modified by age, race/ethnicity, BMI, physical functioning, or family history of diabetes, which supports the generalizability of these findings to community-living older women,” the investigators wrote.

Increased stepping intensity also appeared to lower risk of diabetes. After adjusting for confounding variables, light stepping was not linked to reduced risk (HR, 0.97; 95% CI, 0.73-1.29; P = .83), whereas moderate to vigorous stepping reduced risk by 14% per 2,000 steps (HR, 0.86; 95% CI, 0.74-1.00; P = .04).

“This study provides evidence supporting an association between steps per day and lower incident diabetes,” the investigators concluded. “While further work is needed to identify whether there is a minimum number of steps per day that results in a clinically significant reduction of diabetes and to evaluate the role that step intensity plays in diabetes etiology for older adults, findings from this study suggest that moderate-vigorous–intensity steps may be more important than lower-intensity steps with respect to incident diabetes. Steps per day–based interventions are needed to advance diabetes prevention science in older adults.”

The study was supported by the National Institute on Aging, the National Institute of Diabetes and Digestive and Kidney Diseases, the Tobacco-Related Disease Research Program, and others. The investigators had no potential conflicts of interest.




FROM DIABETES CARE
