The prospect of a medication to treat OSA is getting closer
For researchers involved with sleep disorders, developing a pharmacologic treatment for obstructive sleep apnea (OSA) is a bit like searching for the holy grail. P K Schweitzer and colleagues have published the results of the randomized MARIPOSA study assessing a combination of two medicinal products known as AD109, one of the products having an antimuscarinic effect (aroxybutynin), and the other a noradrenergic effect (atomoxetine), in treating this condition.
MARIPOSA Methodology
The trial included 209 patients, 176 of whom completed the 4-week protocol. The trial was double-blinded according to four parallel arms: participants in the first and second arms received AD109 containing doses of 2.5 mg/75 mg and 5 mg/75 mg of aroxybutynin and atomoxetine, respectively. The third arm received atomoxetine alone (75 mg), and the fourth arm was given a placebo.
Two polysomnograms (PSGs) were carried out at the start and end of the trial, allowing researchers to calculate the apnea-hypopnea index (AHI) and to quantify nocturnal desaturation. These variables are now deemed the primary markers of the risk for cardiovascular complications secondary to OSA. Finally, participants completed questionnaires evaluating excessive daytime sleepiness, fatigue, and sleep quality.
The median age varied from 5 to 57 years, depending on the arm of the study, and body mass index varied between 31.2 and 34.5. Inclusion criteria comprised an AHI between 10 and 45 events per hour, of which at least 75% were described as obstructive. Where continuous positive airway pressure (CPAP) was used (21%-30% of cases), it was discontinued during the trial (within a time frame that is perhaps too short to consider these patients treatment naive).
Combination Brought Improvements
After the 4 weeks of treatment, the AHI measured via follow-up PSG went from a median of 20.5 to 10.8 in arm one and from 19.4 to 9.5 in arm two (P < .0001 vs placebo for both arms). For participants in arm three, AHI went from 19.0 to 11.8 (P < .01 vs placebo).
The rate of nocturnal desaturation (in percentage per hour) declined by 12.7 in arm one (P = .03), by 16.6 in arm two (P = .005), and by 5.2 in arm three (P = .003) compared with placebo. The fatigue score was significantly improved by AD109 2.5 mg/75 mg. The use of atomoxetine alone slightly worsened the sleep disturbance score.
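For context, the relative AHI reductions implied by these medians can be checked directly. The sketch below simply restates the figures above; the arm labels are our shorthand, not the trial's own nomenclature.

```python
# Median AHI (events/hour) at baseline and after 4 weeks, per trial arm,
# as reported in the text above.
ahi = {
    "AD109 2.5/75 mg": (20.5, 10.8),
    "AD109 5/75 mg": (19.4, 9.5),
    "atomoxetine 75 mg": (19.0, 11.8),
}

def relative_reduction(before, after):
    """Percent drop from the baseline median."""
    return round((before - after) / before * 100, 1)

for arm, (before, after) in ahi.items():
    print(f"{arm}: {relative_reduction(before, after)}% reduction")
# AD109 2.5/75 mg: 47.3% reduction
# AD109 5/75 mg: 51.0% reduction
# atomoxetine 75 mg: 37.9% reduction
```

In other words, both AD109 doses roughly halved the median AHI, while atomoxetine alone achieved a smaller drop.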
The main side effects were dry mouth sensation (which was markedly more common with AD109 5 mg/75 mg), difficulty passing urine in 7%-22% of cases, tachycardia in all trial arms, and increased diastolic blood pressure at the 2.5-mg/75-mg dose. The authors concluded that AD109, a combination of noradrenergic and antimuscarinic molecules, is effective in correcting mild to severe OSA.
The 2.5-mg/75-mg dose was as effective as the 5-mg/75-mg dose. Atomoxetine alone was less effective, had more side effects, and was associated with lower-quality sleep. Finally, compliance with the oral treatment was reportedly not checked, even though the authors lean heavily on patient noncompliance with CPAP in presenting their study. A phase 3 trial is underway.
Nevertheless, these results herald important scientific benefits if we consider that Colin Sullivan’s original 1981 research paper, which ushered in the CPAP era, presented the results of just five participants.
This article was translated from JIM, which is part of the Medscape professional network.
A version of this article appeared on Medscape.com.
Early age at first period raises type 2 diabetes risk
TOPLINE:
Early age at menarche is associated with an increased risk for type 2 diabetes, a retrospective study of US women under age 65 found.
METHODOLOGY:
- Researchers analyzed data from 17,377 women who were aged 20-65 years when they participated in a National Health and Nutrition Examination Survey (NHANES) from 1999 to 2018 and reported their age at first menstruation, which was classified as ≤ 10, 11, 12, 13, 14, or ≥ 15 years of age.
- In total, 10.2% of the women (1773) had type 2 diabetes; of these, 11.5% (205) had cardiovascular disease (CVD), defined as coronary heart disease (CHD), myocardial infarction, or stroke.
- Compared with women who had their first menstrual period at age 13 (the mean age in this population), those who had their period at age ≤ 10 had a significantly greater risk of having type 2 diabetes, after adjustment for age, race/ethnicity, education, parity, menopause status, family history of diabetes, smoking status, physical activity, alcohol consumption, and body mass index (odds ratio, 1.32; 95% CI, 1.03-1.69; P trend = .03).
- Among the women with diabetes, compared with those who had their first menstrual period at age 13, those who had it at age ≤ 10 had a significantly greater risk of having stroke (OR, 2.66; 95% CI, 1.07-6.64; P trend = .02), but not CVD or CHD, after adjustment for these multiple variables.
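The proportions above can be re-derived from the stated counts, as a quick sanity check. A minimal sketch (the counts are taken from the bullet points; the CVD share computes to roughly 11.6%, versus the 11.5% reported, presumably a rounding difference in the underlying data):

```python
n_total = 17_377    # women in the NHANES sample
n_diabetes = 1_773  # women with type 2 diabetes
n_cvd = 205         # of those, women with CVD

pct_diabetes = round(n_diabetes / n_total * 100, 1)
pct_cvd = round(n_cvd / n_diabetes * 100, 1)
print(pct_diabetes, pct_cvd)  # → 10.2 11.6
```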
TAKEAWAY:
- In a racially and ethnically diverse national sample of US women younger than 65, “extremely early” age at first menstrual period was associated with significantly increased risk for type 2 diabetes; among the women with type 2 diabetes, it was associated with significantly increased risk for stroke but not CVD or CHD, after adjustment for multiple variables.
- Early age at menarche may be an early indicator of the cardiometabolic disease trajectory in women.
IN PRACTICE:
“Women with early-life exposures such as early age at menarche need to be further examined for diabetes and prevention research and strategies for progression of diabetes complications,” the study authors write.
SOURCE:
The authors, mainly from Tulane University School of Public Health and Tropical Medicine, New Orleans, Louisiana, and also from Harvard Medical School, Boston, Massachusetts, published their findings in BMJ Nutrition, Prevention & Health.
LIMITATIONS:
- The women who participated in NHANES may not be representative of all women in the United States (selection bias).
- The study only included women who reported the age when they had their first menstrual period (selection bias).
- This was a cross-sectional, observational study, so it cannot show causality.
- The women may have reported the wrong age at which they had their first period (recall bias and social desirability bias).
- The women may have inaccurately reported CVD and type 2 diabetes (recall bias and social desirability bias).
DISCLOSURES:
The researchers were supported by grants from the National Heart, Lung, and Blood Institute and from the National Institute of General Medical Sciences of the National Institutes of Health.
A version of this article first appeared on Medscape.com.
‘Hidden hearing loss’ may cause tinnitus: Study
Scientists know that tinnitus, or ringing in the ears, affects 10% of adults worldwide. But they’re not exactly sure what causes the condition.
The traditional belief is that tinnitus happens in people who had already lost hearing. But some people who have tinnitus are still able to perform well on standard hearing tests, according to researchers at the Massachusetts Eye and Ear Infirmary. That happens because the tests don’t pick up auditory nerve loss, sometimes called “hidden hearing loss.”
Stéphane F. Maison, PhD, the lead author of a new study on tinnitus, discussed the findings in a news release about the study.
Tinnitus is sometimes compared to phantom limb syndrome, in which people feel pain in limbs they no longer have. While the study published in Scientific Reports doesn’t refer to phantom limb syndrome, it does talk about “phantom sound.”
“In other words, the brain tries to compensate for the loss of hearing by increasing its activity, resulting in the perception of a phantom sound, tinnitus. Until recently though, this idea was disputed as some tinnitus sufferers have normal hearing tests,” the researchers explained in the news release.
The study included 294 adults — 201 who had never reported having tinnitus, 64 who had reported having temporary tinnitus, and 29 who had reported having constant tinnitus for 6 months or more.
All 294 had performed normally on a pure tone test, in which subjects raise their hands when they hear beeps to measure the quietest sounds they can detect.
In a different kind of test, electrodes measured responses to clicking sounds in the inner ear, the auditory nerve, and the brain. This second test found reduced response in the auditory nerve and increased activity in the brainstem among those who had tinnitus.
Dr Maison, a principal investigator at Eaton-Peabody Laboratories at Mass Eye and Ear/Harvard Medical School, called the study “a first step toward our ultimate goal of silencing tinnitus.”
“Beyond the nuisance of having persistent ringing or other sounds in the ears, tinnitus symptoms are debilitating in many patients, causing sleep deprivation, social isolation, anxiety and depression, adversely affecting work performance, and reducing significantly their quality of life,” he said in the news release. “We won’t be able to cure tinnitus until we fully understand the mechanisms underlying its genesis.”
A version of this article appeared on WebMD.com.
FROM SCIENTIFIC REPORTS
MDMA therapy for loneliness? Researchers say it could work
Some call the drug “ecstasy” or “molly.” Researchers are calling it a potential tool to help treat loneliness.
As public health experts sound the alarm on a rising loneliness epidemic in the United States and across the globe, researchers are exploring whether the drug could help people feel more connected.
In the latest study, MDMA “led to a robust increase in feelings of connection” among people socializing in a controlled setting. Participants were dosed with either MDMA or a placebo and asked to chat with a stranger. Afterward, those who took MDMA said their companion was more responsive and attentive, and that they had plenty in common. The drug also “increased participants’ ratings of liking their partners, feeling connected and finding the conversation enjoyable and meaningful.”
The study was small — just 18 participants — but its results “have implications for MDMA-assisted therapy,” the authors wrote. “This feeling of connectedness could help patients feel safe and trusting, thereby facilitating deeper emotional exploration.”
MDMA “really does seem to make people want to interact more with other people,” says Harriet de Wit, PhD, a neuropharmacologist at the University of Chicago and one of the study’s authors. The results echo those of earlier research using psychedelics like LSD or psilocybin.
It’s important to note that any intervention involving MDMA or psychedelics would be a drug-assisted therapy — that is, used in conjunction with the appropriate therapy and in a therapeutic setting. MDMA-assisted therapy has already drawn popular and scientific attention, as it recently cleared clinical trials for treating posttraumatic stress disorder (PTSD) and may be nearing approval by the US Food and Drug Administration (FDA).
According to Friederike Holze, PhD, psychopharmacologist at the University of Basel, in Switzerland, “there could be a place” for MDMA and psychedelics in treating chronic loneliness, but only under professional supervision.
There would have to be clear guidelines too, says Joshua Woolley, MD, PhD, a psychiatrist at the University of California, San Francisco.
MDMA and psychedelics “induce this plastic state, a state where people can change. They feel open, they feel like things are possible,” Dr. Woolley says. Then, with therapy, “you can help them change.”
Loneliness Can Impact Our Health
On top of the mental health ramifications, the physiologic effects of loneliness could have grave consequences over time. In observational studies, loneliness has been linked to higher risks for cancer and heart disease, and shorter lifespan. One third of Americans over 45 say they are chronically lonely.
Chronic loneliness changes how we think and behave, research shows. It makes us fear contact with others and see them in a more negative light, as more threatening and less trustworthy. Lonely people prefer to stand farther apart from strangers and avoid touch.
This is where MDMA-assisted therapies could potentially help, by easing these defensive tendencies, according to Dr. Woolley.
MDMA, Psychedelics, and Social Behavior
MDMA, or 3,4-methylenedioxymethamphetamine, is a hybrid between a stimulant and a psychedelic. In Dr. de Wit’s earlier experiments, volunteers given MDMA engaged more in communal activities, chatting, and playing games. They used more positive words during social encounters than those who had received a placebo. And after MDMA, people felt less rejected if they were slighted in Cyberball — a virtual ball-tossing game commonly used to measure the effects of social exclusion.
MDMA has been shown to reduce people’s response to others’ negative emotions, diminishing activation of the amygdala (the brain’s fear center) while looking at pictures of angry faces.
This could be helpful. “If you perceive a person’s natural expression as being a little bit angry, if that disappears, then you might be more inclined to interact,” de Wit says.
However, there may be downsides, too. If a drug makes people more trusting and willing to connect, they could be taken advantage of. This is why, Dr. Woolley says, “psychedelics have been used in cults.”
MDMA may also make the experience of touch more pleasant. In a series of experiments in 2019, researchers gently stroked volunteers’ arms with a goat-hair brush, mimicking the comforting gestures one may receive from a loved one. At the same time, the scientists monitored the volunteers’ facial muscles. People on MDMA perceived gentle touch as more pleasant than those on placebo, and their smile muscles activated more.
MDMA and psychedelics boost social behaviors in animals, too — suggesting that their effects on relationships have a biological basis. Rats on MDMA are more likely to lie next to each other, and mice become more resilient to social stress. Even octopuses become more outgoing after a dose of MDMA, choosing to spend more time with other octopuses instead of a new toy. Classic psychedelics show similar effects — LSD, for example, makes mice more social.
Psychedelics can induce a sense of a “dissolution of the self-other boundary,” Dr. Woolley says. People who take them often say it’s “helped them feel more connected to themselves and other people.” LSD, first synthesized in 1938, may help increase empathy in some people.
Psilocybin, a compound found in over 200 species of mushrooms and used for centuries in Mesoamerican rituals, also seems to boost empathy, with effects persisting for at least seven days. In Cyberball, the online ball-throwing game, people who took psilocybin felt less socially rejected, an outcome reflected in their brain activation patterns in one study — the areas responsible for social-pain processing appeared to dim after a dose.
Making It Legal and Putting It to Use
In 2020, Oregon became the first state to establish a regulatory framework for psilocybin for therapeutic use, and Colorado followed suit in 2022. Such therapeutic applications of psilocybin could help fight loneliness as well, Dr. Woolley believes, because a “ common symptom of depression is that people feel socially withdrawn and lack motivation, ” he says. As mentioned above, MDMA-assisted therapy is also nearing FDA approval for PTSD.
What remain unclear are the exact mechanisms at play.
“MDMA releases oxytocin, and it does that through serotonin receptors,” Dr. de Wit says. Serotonin activates 5-HT1A receptors in the hypothalamus, releasing oxytocin into the bloodstream. In Dr. de Wit’s recent experiments, the more people felt connected after taking MDMA, the more oxytocin was found circulating in their bodies. (Another drug, methamphetamine, also upped the levels of oxytocin but did not increase feelings of connectedness.)
“It’s likely that both something in the serotonin system independent of oxytocin, and oxytocin itself, contribute,” Dr. de Wit says. Dopamine, a neurotransmitter responsible for motivation, appears to increase as well.
The empathy-boosting effects of LSD also seem to be at least partly driven by oxytocin, experiments published in 2021 revealed. Studies in mice, meanwhile, suggest that glutamate, a chemical messenger in the brain, may be behind some of LSD’s prosocial effects.
Scientists are fairly certain which receptors these drugs bind to and which neurotransmitters they affect. “How that gets translated into these higher-order things like empathy and feeling connected to the world, we don’t totally understand,” Dr. Woolley says.
Challenges and the Future
Although MDMA and psychedelics are largely considered safe when taken in a legal, medically controlled setting, there is reason to be cautious.
“They have relatively low impact on the body, like heart rate increase or blood pressure increase. But they might leave some disturbing psychological effects,” says Dr. Holze. Scientists routinely screen experiment volunteers for their risk for psychiatric disorders.
Although risk for addiction is low with both MDMA and psychedelics, there is always some risk for misuse. MDMA “ can produce feelings of well-being, and then people might use it repeatedly, ” Dr. de Wit says. “ That doesn ’ t seem to be a problem for really a lot of people, but it could easily happen. ”
Still, possibilities remain for MDMA in the fight against loneliness.
“[People] feel open, they feel like things are possible, they feel like they’re unstuck,” Dr. Woolley says. “You can harness that in psychotherapy.”
A version of this article appeared on Medscape.com.
Some call the drug “ecstasy” or “molly.” Researchers are calling it a potential tool to help treat loneliness.
As public health experts sound the alarm on a rising loneliness epidemic in the United States and across the globe, researchers are investigating whether the drug could help people reconnect.
In the latest study, MDMA “led to a robust increase in feelings of connection” among people socializing in a controlled setting. Participants were dosed with either MDMA or a placebo and asked to chat with a stranger. Afterward, those who took MDMA said their companion was more responsive and attentive, and that they had plenty in common. The drug also “increased participants’ ratings of liking their partners, feeling connected and finding the conversation enjoyable and meaningful.”
The study was small — just 18 participants — but its results “have implications for MDMA-assisted therapy,” the authors wrote. “This feeling of connectedness could help patients feel safe and trusting, thereby facilitating deeper emotional exploration.”
MDMA “really does seem to make people want to interact more with other people,” says Harriet de Wit, PhD, a neuropharmacologist at the University of Chicago and one of the study’s authors. The results echo those of earlier research using psychedelics like LSD or psilocybin.
It’s important to note that any intervention involving MDMA or psychedelics would be a drug-assisted therapy — that is, used in conjunction with the appropriate therapy and in a therapeutic setting. MDMA-assisted therapy has already drawn popular and scientific attention, as it recently cleared clinical trials for treating posttraumatic stress disorder (PTSD) and may be nearing approval by the US Food and Drug Administration (FDA).
According to Friederike Holze, PhD, psychopharmacologist at the University of Basel, in Switzerland, “there could be a place” for MDMA and psychedelics in treating chronic loneliness, but only under professional supervision.
There would have to be clear guidelines too, says Joshua Woolley, MD, PhD, a psychiatrist at the University of California, San Francisco.
MDMA and psychedelics “induce this plastic state, a state where people can change. They feel open, they feel like things are possible,” Dr. Woolley says. Then, with therapy, “you can help them change.”
Loneliness Can Impact Our Health
On top of the mental health ramifications, the physiologic effects of loneliness could have grave consequences over time. In observational studies, loneliness has been linked to higher risks for cancer and heart disease, and shorter lifespan. One third of Americans over 45 say they are chronically lonely.
Chronic loneliness changes how we think and behave, research shows. It makes us fear contact with others and see them in a more negative light, as more threatening and less trustworthy. Lonely people prefer to stand farther apart from strangers and avoid touch.
This is where MDMA-assisted therapies could potentially help, by easing these defensive tendencies, according to Dr. Woolley.
MDMA, Psychedelics, and Social Behavior
MDMA, or 3,4-methylenedioxymethamphetamine, is a hybrid between a stimulant and a psychedelic. In Dr. de Wit’s earlier experiments, volunteers given MDMA engaged more in communal activities, chatting, and playing games. They used more positive words during social encounters than those who had received a placebo. And after MDMA, people felt less rejected if they were slighted in Cyberball — a virtual ball-tossing game commonly used to measure the effects of social exclusion.
MDMA has been shown to reduce people’s response to others’ negative emotions, diminishing activation of the amygdala (the brain’s fear center) when people view pictures of angry faces.
This could be helpful. “If you perceive a person’s natural expression as being a little bit angry, if that disappears, then you might be more inclined to interact,” Dr. de Wit says.
However, there may be downsides, too. If a drug makes people more trusting and willing to connect, they could be taken advantage of. This is why, Dr. Woolley says, “psychedelics have been used in cults.”
MDMA may also make the experience of touch more pleasant. In a series of experiments in 2019, researchers gently stroked volunteers’ arms with a goat-hair brush, mimicking the comforting gestures one may receive from a loved one. At the same time, the scientists monitored the volunteers’ facial muscles. People on MDMA perceived gentle touch as more pleasant than those on placebo, and their smile muscles activated more.
MDMA and psychedelics boost social behaviors in animals, too — suggesting that their effects on relationships have a biological basis. Rats on MDMA are more likely to lie next to each other, and mice become more resilient to social stress. Even octopuses become more outgoing after a dose of MDMA, choosing to spend more time with other octopuses instead of a new toy. Classic psychedelics show similar effects — LSD, for example, makes mice more social.
Psychedelics can induce a sense of a “dissolution of the self-other boundary,” Dr. Woolley says. People who take them often say it’s “helped them feel more connected to themselves and other people.” LSD, first synthesized in 1938, may help increase empathy in some people.
Psilocybin, a compound found in over 200 species of mushrooms and used for centuries in Mesoamerican rituals, also seems to boost empathy, with effects persisting for at least seven days. In Cyberball, the online ball-throwing game, people who took psilocybin felt less socially rejected, an outcome reflected in their brain activation patterns in one study — the areas responsible for social-pain processing appeared to dim after a dose.
Making It Legal and Putting It to Use
In 2020, Oregon became the first state to establish a regulatory framework for psilocybin for therapeutic use, and Colorado followed suit in 2022. Such therapeutic applications of psilocybin could help fight loneliness as well, Dr. Woolley believes, because a “common symptom of depression is that people feel socially withdrawn and lack motivation,” he says. As mentioned above, MDMA-assisted therapy is also nearing FDA approval for PTSD.
The exact mechanisms at play remain unclear.
“MDMA releases oxytocin, and it does that through serotonin receptors,” Dr. de Wit says. Serotonin activates 5-HT1A receptors in the hypothalamus, releasing oxytocin into the bloodstream. In Dr. de Wit’s recent experiments, the more people felt connected after taking MDMA, the more oxytocin was found circulating in their bodies. (Another drug, methamphetamine, also upped the levels of oxytocin but did not increase feelings of connectedness.)
“It’s likely that both something in the serotonin system independent of oxytocin, and oxytocin itself, contribute,” Dr. de Wit says. Dopamine, a neurotransmitter responsible for motivation, appears to increase as well.
The empathy-boosting effects of LSD also seem to be at least partly driven by oxytocin, experiments published in 2021 revealed. Studies in mice, meanwhile, suggest that glutamate, a chemical messenger in the brain, may be behind some of LSD’s prosocial effects.
Scientists are fairly certain which receptors these drugs bind to and which neurotransmitters they affect. “How that gets translated into these higher-order things like empathy and feeling connected to the world, we don’t totally understand,” Dr. Woolley says.
Challenges and the Future
Although MDMA and psychedelics are largely considered safe when taken in a legal, medically controlled setting, there is reason to be cautious.
“They have relatively low impact on the body, like heart rate increase or blood pressure increase. But they might leave some disturbing psychological effects,” says Dr. Holze. Scientists routinely screen experiment volunteers for their risk for psychiatric disorders.
Although risk for addiction is low with both MDMA and psychedelics, there is always some risk for misuse. MDMA “can produce feelings of well-being, and then people might use it repeatedly,” Dr. de Wit says. “That doesn’t seem to be a problem for really a lot of people, but it could easily happen.”
Still, possibilities remain for MDMA in the fight against loneliness.
“[People] feel open, they feel like things are possible, they feel like they’re unstuck,” Dr. Woolley says. “You can harness that in psychotherapy.”
A version of this article appeared on Medscape.com.
2023 AGA Innovation Conference on the Advances in Endosurgery
WASHINGTON, DC — The 2023 AGA Innovation Conference (formerly Consensus Conference) on the Advances in Endosurgery was held here November 10-11. It was organized and chaired by Amrita Sethi, MD, Columbia University Irving Medical Center—NYP, and Sri Komanduri, MD, MS, Feinberg School of Medicine, Northwestern University, Chicago. The conference brought together gastroenterologists (GIs), surgeons, and industry partners to explore what further collaboration and clinical adoption are needed to advance endosurgical applications. Both GIs and surgeons welcomed potential collaboration, especially in developing strategies to promote education and training initiatives, including defining which procedures and techniques are to be included in the endosurgery arena. Jeffrey Potkul, Medtronic Endoscopy, noted that this was a “great forum, format, and discussions — it will take novel approaches such as this conference and new collaboration models to ensure technology innovation in the endoluminal space can reach patients and empower improved outcomes in Gastroenterology.”
Topics discussed included third space endoscopy, endobariatric and metabolic endoscopy, and endoscopy related to transluminal access. Exciting new developments in robotic endoscopy were also highlighted, with an attempt to understand the value proposition of this innovation in the endoscopy space, as well as the successes and failures of past efforts to guide the path forward. Other issues raised included methods for device development, such as initiating research studies, navigating regulatory processes for Food and Drug Administration approval of new devices, and ongoing issues related to billing and reimbursement. There was consensus around the need for collaboration among all stakeholders to drive innovation and its adoption in the field of endosurgery. This meeting is one of the first of its kind to bring innovators across multiple disciplines together with the intention of moving the entire field of endosurgery forward and encouraging creative solutions.
We would like to thank the members of the AGA Center for GI Innovation and Technology Committee and attendees who made this year’s conference a success. The conference was supported by independent grants from Boston Scientific Corporation, Cook Medical Inc., Endo Tools Therapeutics, Fujifilm Healthcare Americas Corporation, Intuitive Surgical, Olympus Corporation, and Medtronic.
Statins and the liver: Not harmful and perhaps beneficial
This installment of Pearls from the Pros was published in Gastro Hep Advances.
Dr. Friedman is the Anton R. Fried, MD, Chair of the Department of Medicine at Newton-Wellesley Hospital in Newton, Mass., and assistant chief of medicine at Massachusetts General Hospital, and a professor of medicine at Harvard Medical School and Tufts University, Boston. Dr. Martin is chief of the division of digestive health and liver diseases at the University of Miami, where he is the Mandel Chair of Gastroenterology. The authors disclose no conflicts.
AGA clinical practice guideline affirms role of biomarkers in Crohn’s disease management
The American Gastroenterological Association (AGA) has released a new clinical practice guideline on the role of biomarkers in the management of Crohn’s disease, offering the most specific evidence-based recommendations yet for the use of fecal calprotectin (FCP) and serum C-reactive protein (CRP) in assessing disease activity.
Repeated monitoring with endoscopy allows for an objective assessment of inflammation and mucosal healing compared with symptoms alone. However, relying solely on endoscopy to guide management is an approach “limited by cost and resource utilization, invasiveness, and reduced patient acceptability,” wrote guideline authors on behalf of the AGA Clinical Guidelines Committee. The guideline was published online Nov. 17 in Gastroenterology.
“Use of biomarkers is no longer considered experimental and should be an integral part of IBD care and monitoring,” said Ashwin Ananthakrishnan, MBBS, MPH, a gastroenterologist with Massachusetts General Hospital in Boston and first author of the guideline. “We need further studies to define their optimal longitudinal use, but at a given time point, there is now abundant evidence that biomarkers provide significant incremental benefit over symptoms alone in assessing a patient’s status.”
Drawing on evidence from randomized controlled trials and observational studies and applying it to common clinical scenarios, the guideline makes conditional recommendations on the use of biomarkers in patients with established, diagnosed disease who are asymptomatic, symptomatic, or in surgically induced remission. Those recommendations, laid out in a detailed Clinical Decision Support Tool, include the following:
For asymptomatic patients: Check CRP and FCP every 6-12 months. Patients with normal levels, and who have endoscopically confirmed remission within the last 3 years without any subsequent change in symptoms or treatment, need not undergo endoscopy and can be followed with biomarker and clinical checks alone. If CRP or FCP are elevated (defined as CRP ≥ 5 mg/L, FCP ≥ 150 mcg/g), consider repeating biomarkers and/or performing endoscopic assessment of disease activity before adjusting treatment.
For mildly symptomatic patients: The role of biomarker testing may be limited, and endoscopic or radiologic assessment may be required to assess active inflammation, given the higher rates of false-positive and false-negative biomarker results in this population.
For patients with more severe symptoms: Elevated CRP or FCP can be used to guide treatment adjustment without endoscopic confirmation in certain situations. Normal levels may be false negatives and should be confirmed by endoscopic assessment of disease activity.
For patients in surgically induced remission with a low likelihood of recurrence: FCP levels below 50 mcg/g can be used in lieu of routine endoscopic assessment within the first year after surgery. Higher FCP levels should prompt endoscopic assessment.
For patients in surgically induced remission with a high risk of recurrence: Do not rely on biomarkers. Perform endoscopic assessment.
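The threshold-based monitoring rules above can be sketched as a simple decision helper. This is a hypothetical illustration only: the cutoffs (CRP ≥ 5 mg/L, FCP ≥ 150 mcg/g) come from the guideline text, but the function name and the simplified branching are assumptions, not the AGA's actual Clinical Decision Support Tool.

```python
# Hypothetical sketch of the biomarker-based monitoring rules described above.
# Thresholds come from the guideline text; the simplified branching is
# illustrative only and is NOT the AGA Clinical Decision Support Tool.

def next_step(symptoms: str, crp_mg_l: float, fcp_mcg_g: float,
              recent_endoscopic_remission: bool = False) -> str:
    """Suggest a monitoring step for a patient with established Crohn's disease."""
    biomarkers_elevated = crp_mg_l >= 5 or fcp_mcg_g >= 150

    if symptoms == "asymptomatic":
        if not biomarkers_elevated and recent_endoscopic_remission:
            return "continue biomarker + clinical checks every 6-12 months"
        if biomarkers_elevated:
            return "repeat biomarkers and/or endoscopic assessment before adjusting treatment"
        return "consider endoscopic assessment (no recent endoscopic remission)"

    if symptoms == "mild":
        # Biomarkers are less reliable in this group per the guideline.
        return "endoscopic or radiologic assessment of active inflammation"

    # More severe symptoms.
    if biomarkers_elevated:
        return "adjust treatment; endoscopic confirmation optional in certain situations"
    return "confirm with endoscopic assessment (possible false-negative biomarkers)"

print(next_step("asymptomatic", crp_mg_l=2.0, fcp_mcg_g=80,
                recent_endoscopic_remission=True))
# → continue biomarker + clinical checks every 6-12 months
```

As the guideline stresses, biomarker results are an adjunct to, not a replacement for, clinical judgment, and post-surgical patients follow the separate FCP < 50 mcg/g rule described above.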
All recommendations were deemed of low to moderate certainty based on results from randomized clinical trials and observational studies that utilized these biomarkers in patients with Crohn’s disease. Citing a dearth of quality evidence, the guideline authors determined they could not make recommendations on the use of a third proprietary biomarker — the endoscopic healing index (EHI).
Recent AGA Clinical Practice Guidelines on the role of biomarkers in ulcerative colitis, published in March, also support a strong role for fecal and blood biomarkers, determining when these can be used to avoid unneeded endoscopic assessments. However, in patients with Crohn’s disease, symptoms correlate less well with endoscopic activity.
As a result, “biomarker performance was acceptable only in asymptomatic individuals who had recently confirmed endoscopic remission; in those without recent endoscopic assessment, test performance was suboptimal.” In addition, the weaker correlation between symptoms and endoscopic activity in Crohn’s “reduced the utility of biomarker measurement to infer disease activity in those with mild symptoms.”
The guidelines were fully funded by the AGA Institute. The authors disclosed a number of potential conflicts of interest, including receiving research grants, as well as consulting and speaking fees, from pharmaceutical companies.
FROM GASTROENTEROLOGY
Experimental Therapy Restores Cognitive Function in Chronic TBI
Deep brain stimulation (DBS) of the central thalamus improved cognitive function in a small trial of adults with moderate to severe traumatic brain injury (msTBI) and chronic sequelae.
Participants in this first-in-humans trial experienced brain injuries 3-18 years before the study that left them with persistent neuropsychological impairment and a range of functional disabilities.
This is the first time a DBS device has been implanted in the central thalamus in humans, an area of the brain measuring only a few millimeters wide that helps regulate consciousness.
Placing the electrodes required a novel surgical technique developed by the investigators that included virtual models of each participant’s brain, microelectrode recording, and neuroimaging to identify neuronal circuits affected by the TBI.
After 3 months of 12-hour daily DBS treatments, participants’ performance on cognitive tests improved by an average of 32% from baseline. Participants were able to read books, watch TV shows, play video games, complete schoolwork, and felt significantly less fatigued during the day.
Although the small trial included only five patients, the work is already being hailed by other experts as significant.
“We were looking for partial restoration of executive attention and expected [the treatment] would have an effect, but I wouldn’t have anticipated the effect size we saw,” co-lead investigator Nicholas Schiff, MD, professor of neuroscience at Weill Cornell Medical College, New York City, said in an interview.
The findings were published online Dec. 4 in Nature Medicine.
“No Trivial Feat”
An estimated 5.3 million children and adults are living with a permanent TBI-related disability in the US today. There is currently no effective therapy for impaired attention, executive function, working memory, or information-processing speed caused by the initial injury.
Previous research suggests that a loss of activity in key brain circuits in the thalamus may be associated with a loss of cognitive function.
The investigators recruited six adults (four men and two women) between the ages of 22 and 60 years with a history of msTBI and chronic neuropsychological impairment and functional disability. One participant was later withdrawn from the trial for protocol noncompliance.
Participants completed a range of questionnaires and tests to establish baseline cognitive, psychological, and quality-of-life status.
To restore lost executive functioning in the brain, investigators had to target not only the central lateral nucleus, but also the neuronal network connected to the region that reaches other parts of the brain.
“To do both of those things we had to develop a whole toolset in order to model both the target and trajectory, which had to be right to make it work properly,” co-lead investigator Jaimie Henderson, MD, professor of neurosurgery at Stanford University School of Medicine, Stanford, California, said in an interview. “That gave us a pretty narrow window in which to work and getting an electrode accurately to this target is not a trivial feat.”
“A Moving Target”
Each participant’s brain physiology was slightly different, meaning the path that worked for one individual might not work for another. The surgery was further complicated by shifting in the brain that occurred as individual electrodes were placed.
“It was a literal moving target,” Dr. Henderson said.
In the beginning, investigators used microelectrode recording to “listen” to individual neurons to see which ones weren’t firing correctly.
When that method failed to offer the precise information needed for electrode placement, the investigators switched to neuroimaging, which allowed them to complete the surgery more quickly and accurately.
Participants remained in the hospital 1-2 days after surgery. They returned for postoperative imaging 30 days after surgery and were randomly assigned to different schedules for a 14-day titration period to optimize DBS stimulation.
The primary outcome was a 10% improvement on part B of the trail-making test, a neuropsychological test that measures executive functioning.
After 90 days of 12-hour daily DBS treatments, participants’ scores increased 15%–52% (average 32%) from baseline. Participants also reported an average of 33% decline in fatigue, one of the most common side effects of msTBI, and an average 80% improvement in attention.
The main safety risk during the 3- to 4-hour procedure is bleeding, which didn’t affect any of the participants in this study. One participant developed a surgical site infection, but all other side effects were mild.
After the 90-day treatment period, the study plan called for patients to be randomly assigned to a blinded withdrawal of treatment, with the DBS turned off for 21 days. Two of the patients declined to be randomized. DBS was turned off in one participant while the other two continued as normal.
After 3 weeks, the patient whose DBS was turned off showed a 34% decline on cognitive tests. The device was reactivated after the study and that participant has since reported improvements.
The DBS devices continue to function in all participants. Although their performance is not being measured as part of the study, anecdotal reports indicate sustained improvement in executive functioning.
“The brain injury causes this global down-regulation of brain function and what we think that this is doing is turning that back up again,” Dr. Henderson said. “At a very simplistic level, what we’re trying to do is turn the lights back up after the dimmer switch is switched down from the injury.”
New Hope
TBI patients are usually treated aggressively during the first year, when significant improvements are most likely, but there are few therapeutic options beyond that time, said neurologist Javier Cardenas, MD, who commented on the findings for this article.
“Many providers throw their hands up after a year in terms of intervention and then we’re always looking at potential declines over time,” said Dr. Cardenas, director of the Concussion and Brain Injury Center at the Rockefeller Neuroscience Institute, West Virginia University, Morgantown. “Most people plateau and don’t decline but we’re always worried about a secondary decline in traumatic brain injury.”
Surgery is usually employed only immediately following the brain injury. The notion of surgery as a therapeutic option years after the initial assault on the brain is novel, said Jimmy Yang, MD, assistant professor of neurologic surgery at Ohio State University College of Medicine, Columbus, who commented on the findings for this article.
“While deep brain stimulation surgery in clinical practice is specifically tailored to each patient we treat, this study goes a step further by integrating research tools that have not yet made it to the clinical realm,” Dr. Yang said. “As a result, while these methods are not commonly used in clinical care, the overall strategy highlights how research advances are linked to clinical advances.”
Investigators are working to secure funding for a larger phase 2 trial.
“With millions of people affected by traumatic brain injury but without effective therapies, this study brings hope that options are on the horizon to help these patients,” Dr. Yang said.
The study was supported by funding from the National Institutes of Health BRAIN Initiative and a grant from the Translational Science Center at Weill Cornell Medical College. Surgical implants were provided by Medtronic. Dr. Henderson and Dr. Schiff are listed as inventors on several patent applications for the experimental DBS therapy described in the study. Dr. Cardenas and Dr. Yang report no relevant financial relationships.
A version of this article first appeared on Medscape.com.
(msTBI) and chronic sequelae.
Participants in this first-in-humans trial experienced brain injuries between 3-18 years before the study that left them with persistent neuropsychological impairment and a range of functional disabilities.
This is the first time a DBS device has been implanted in the central thalamus in humans, an area of the brain measuring only a few millimeters wide that helps regulate consciousness.
Placing the electrodes required a novel surgical technique developed by the investigators that included virtual models of each participant’s brain, microelectrode recording, and neuroimaging to identify neuronal circuits affected by the TBI.
After 3 months of 12-hour daily DBS treatments, participants’ performance on cognitive tests improved by an average of 32% from baseline. Participants were able to read books, watch TV shows, play video games, complete schoolwork, and felt significantly less fatigued during the day.
Although the small trial only included five patients, the work is already being hailed by other experts as significant.“We were looking for partial restoration of executive attention and expected [the treatment] would have an effect, but I wouldn’t have anticipated the effect size we saw,” co-lead investigator Nicholas Schiff, MD, professor of neuroscience at Weill Cornell Medical College, New York City, said in an interview.
The findings were published online Dec. 4 in Nature Medicine.
“No Trivial Feat”
An estimated 5.3 million children and adults are living with a permanent TBI-related disability in the US today. There currently is no effective therapy for impaired attention, executive function, working memory or information-processing speed caused by the initial injury.
Previous research suggests that a loss of activity in key brain circuits in the thalamus may be associated with a loss of cognitive function.
The investigators recruited six adults (four men and two women) between the ages of 22 and 60 years with a history of msTBI and chronic neuropsychological impairment and functional disability. One participant was later withdrawn from the trial for protocol noncompliance.
Participants completed a range of questionnaires and tests to establish baseline cognitive, psychological, and quality-of-life status.
To restore lost executive functioning in the brain, investigators had to target not only the central lateral nucleus, but also the neuronal network connected to the region that reaches other parts of the brain.
“To do both of those things we had to develop a whole toolset in order to model both the target and trajectory, which had to be right to make it work properly,” co-lead investigator Jaimie Henderson, MD, professor of neurosurgery at Stanford University College of Medicine, Stanford, California, said in an interview. “That gave us a pretty narrow window in which to work and getting an electrode accurately to this target is not a trivial feat.”
“A Moving Target”
Each participant’s brain physiology was slightly different, meaning the path that worked for one individual might not work for another. The surgery was further complicated by shifting in the brain that occurred as individual electrodes were placed.
“It was a literal moving target,” Dr. Henderson said.
In the beginning, investigators used microelectrode recording to “listen” to individual neurons to see which ones weren’t firing correctly.
When that method failed to offer the precise information needed for electrode placement, the investigators switched to neuroimaging, which allowed them to complete the surgery more quickly and accurately.
Participants remained in the hospital 1-2 days after surgery. They returned for postoperative imaging 30 days after surgery and were randomly assigned to different schedules for a 14-day titration period to optimize DBS stimulation.
The primary outcome was a 10% improvement on part B of the trail-making test, a neuropsychological test that measures executive functioning.
After 90 days of 12-hour daily DBS treatments, participants’ scores increased 15%–52% (average 32%) from baseline. Participants also reported an average of 33% decline in fatigue, one of the most common side effects of msTBI, and an average 80% improvement in attention.
The main safety risk during the 3- to-4-hour procedure is bleeding, which didn’t affect any of the participants in this study. One participant developed a surgical site infection, but all other side effects were mild.
After the 90-day treatment period, the study plan called for patients to be randomly assigned to a blinded withdrawal of treatment, with the DBS turned off for 21 days. Two of the patients declined to be randomized. DBS was turned off in one participant while the other two continued as normal.
After 3 weeks, the patient whose DBS was turned off showed a 34% decline on cognitive tests. The device was reactivated after the study and that participant has since reported improvements.
The DBS devices continue to function in all participants. Although their performance is not being measured as part of the study, anecdotal reports indicate sustained improvement in executive functioning.
“The brain injury causes this global down-regulation of brain function and what we think that this is doing is turning that back up again,” Dr. Henderson said. “At a very simplistic level, what we’re trying to do is turn the lights back up after the dimmer switch is switched down from the injury.”
New Hope
TBI patients are usually treated aggressively during the first year, when significant improvements are most likely, but there are few therapeutic options beyond that time, said neurologist Javier Cardenas, MD, who commented on the findings for this article.
“Many providers throw their hands up after a year in terms of intervention and then we’re always looking at potential declines over time,” said Dr. Cardenas, director of the Concussion and Brain Injury Center at the Rockefeller Neuroscience Institution, West Virginia University, Morgantown. “Most people plateau and don’t decline but we’re always worried about a secondary decline in traumatic brain injury.”Surgery is usually only employed immediately following the brain injury. The notion of surgery as a therapeutic option years after the initial assault on the brain is novel, said Jimmy Yang, MD, assistant professor of neurologic surgery at Ohio State University College of Medicine, Columbus, who commented on the findings for this article.
“While deep brain stimulation surgery in clinical practice is specifically tailored to each patient we treat, this study goes a step further by integrating research tools that have not yet made it to the clinical realm,” Dr. Yang said. “As a result, while these methods are not commonly used in clinical care, the overall strategy highlights how research advances are linked to clinical advances.”
Investigators are working to secure funding for a larger phase 2 trial.
“With millions of people affected by traumatic brain injury but without effective therapies, this study brings hope that options are on the horizon to help these patients,” Dr. Yang said.
The study was supported by funding from the National Institute of Health BRAIN Initiative and a grant from the Translational Science Center at Weill Cornell Medical College. Surgical implants were provided by Medtronic. Dr. Henderson and Dr. Schiff are listed as inventors on several patent applications for the experimental DBS therapy described in the study. Dr. Cardenas and Dr. Yang report no relevant financial relationships.
A version of this article first appeared on Medscape.com .
(msTBI) and chronic sequelae.
Participants in this first-in-humans trial experienced brain injuries between 3-18 years before the study that left them with persistent neuropsychological impairment and a range of functional disabilities.
This is the first time a DBS device has been implanted in the central thalamus in humans, an area of the brain measuring only a few millimeters wide that helps regulate consciousness.
Placing the electrodes required a novel surgical technique developed by the investigators that included virtual models of each participant’s brain, microelectrode recording, and neuroimaging to identify neuronal circuits affected by the TBI.
After 3 months of 12-hour daily DBS treatments, participants’ performance on cognitive tests improved by an average of 32% from baseline. Participants were able to read books, watch TV shows, play video games, complete schoolwork, and felt significantly less fatigued during the day.
Although the small trial only included five patients, the work is already being hailed by other experts as significant.“We were looking for partial restoration of executive attention and expected [the treatment] would have an effect, but I wouldn’t have anticipated the effect size we saw,” co-lead investigator Nicholas Schiff, MD, professor of neuroscience at Weill Cornell Medical College, New York City, said in an interview.
The findings were published online Dec. 4 in Nature Medicine.
“No Trivial Feat”
An estimated 5.3 million children and adults are living with a permanent TBI-related disability in the US today. There currently is no effective therapy for impaired attention, executive function, working memory or information-processing speed caused by the initial injury.
Previous research suggests that a loss of activity in key brain circuits in the thalamus may be associated with a loss of cognitive function.
The investigators recruited six adults (four men and two women) between the ages of 22 and 60 years with a history of msTBI and chronic neuropsychological impairment and functional disability. One participant was later withdrawn from the trial for protocol noncompliance.
Participants completed a range of questionnaires and tests to establish baseline cognitive, psychological, and quality-of-life status.
To restore lost executive functioning in the brain, investigators had to target not only the central lateral nucleus, but also the neuronal network connected to the region that reaches other parts of the brain.
“To do both of those things we had to develop a whole toolset in order to model both the target and trajectory, which had to be right to make it work properly,” co-lead investigator Jaimie Henderson, MD, professor of neurosurgery at Stanford University College of Medicine, Stanford, California, said in an interview. “That gave us a pretty narrow window in which to work and getting an electrode accurately to this target is not a trivial feat.”
“A Moving Target”
Each participant’s brain physiology was slightly different, meaning the path that worked for one individual might not work for another. The surgery was further complicated by shifting in the brain that occurred as individual electrodes were placed.
“It was a literal moving target,” Dr. Henderson said.
In the beginning, investigators used microelectrode recording to “listen” to individual neurons to see which ones weren’t firing correctly.
When that method failed to offer the precise information needed for electrode placement, the investigators switched to neuroimaging, which allowed them to complete the surgery more quickly and accurately.
Participants remained in the hospital 1-2 days after surgery. They returned for postoperative imaging 30 days after surgery and were randomly assigned to different schedules for a 14-day titration period to optimize DBS stimulation.
The primary outcome was a 10% improvement on part B of the trail-making test, a neuropsychological test that measures executive functioning.
After 90 days of 12-hour daily DBS treatments, participants’ scores increased 15%–52% (average 32%) from baseline. Participants also reported an average of 33% decline in fatigue, one of the most common side effects of msTBI, and an average 80% improvement in attention.
The main safety risk during the 3- to-4-hour procedure is bleeding, which didn’t affect any of the participants in this study. One participant developed a surgical site infection, but all other side effects were mild.
After the 90-day treatment period, the study plan called for patients to be randomly assigned to a blinded withdrawal of treatment, with the DBS turned off for 21 days. Two of the patients declined to be randomized. DBS was turned off in one participant while the other two continued as normal.
After 3 weeks, the patient whose DBS was turned off showed a 34% decline on cognitive tests. The device was reactivated after the study and that participant has since reported improvements.
The DBS devices continue to function in all participants. Although their performance is not being measured as part of the study, anecdotal reports indicate sustained improvement in executive functioning.
“The brain injury causes this global down-regulation of brain function and what we think that this is doing is turning that back up again,” Dr. Henderson said. “At a very simplistic level, what we’re trying to do is turn the lights back up after the dimmer switch is switched down from the injury.”
New Hope
TBI patients are usually treated aggressively during the first year, when significant improvements are most likely, but there are few therapeutic options beyond that time, said neurologist Javier Cardenas, MD, who commented on the findings for this article.
“Many providers throw their hands up after a year in terms of intervention and then we’re always looking at potential declines over time,” said Dr. Cardenas, director of the Concussion and Brain Injury Center at the Rockefeller Neuroscience Institution, West Virginia University, Morgantown. “Most people plateau and don’t decline but we’re always worried about a secondary decline in traumatic brain injury.”Surgery is usually only employed immediately following the brain injury. The notion of surgery as a therapeutic option years after the initial assault on the brain is novel, said Jimmy Yang, MD, assistant professor of neurologic surgery at Ohio State University College of Medicine, Columbus, who commented on the findings for this article.
“While deep brain stimulation surgery in clinical practice is specifically tailored to each patient we treat, this study goes a step further by integrating research tools that have not yet made it to the clinical realm,” Dr. Yang said. “As a result, while these methods are not commonly used in clinical care, the overall strategy highlights how research advances are linked to clinical advances.”
Investigators are working to secure funding for a larger phase 2 trial.
“With millions of people affected by traumatic brain injury but without effective therapies, this study brings hope that options are on the horizon to help these patients,” Dr. Yang said.
The study was supported by funding from the National Institutes of Health BRAIN Initiative and a grant from the Translational Science Center at Weill Cornell Medical College. Surgical implants were provided by Medtronic. Dr. Henderson and Dr. Schiff are listed as inventors on several patent applications for the experimental DBS therapy described in the study. Dr. Cardenas and Dr. Yang report no relevant financial relationships.
A version of this article first appeared on Medscape.com.
Acylcarnitines could drive IBD via dysbiosis
These findings improve our understanding of IBD pathogenesis and disease course, and could prove valuable in biomarker research, reported lead author Gary D. Wu, MD, of the University of Pennsylvania, Philadelphia, and colleagues.
In health, carnitine and acylcarnitines aid in fatty acid transport, the investigators wrote in September in Cellular and Molecular Gastroenterology and Hepatology. Acylcarnitines are also involved in metabolic signaling, and in the absence of sufficient short-chain fatty acids may serve as an alternative energy source for the intestinal epithelium.
“Recently, we and others have shown that fecal acylcarnitines are increased in patients with IBD, especially during dysbiosis,” they noted. “However, the mechanism(s) responsible for the increase of fecal acylcarnitines in IBD and their biological function have not been elucidated.”
The present study aimed to address this knowledge gap by characterizing both carnitine and acylcarnitines in pediatric IBD.
First, the investigators confirmed that both carnitine and acylcarnitines were elevated in fecal samples from pediatric patients with IBD.
Next, they analyzed fecal samples from subjects in the Food and Resulting Microbiota and Metabolome (FARMM) study, which compared microbiota recovery after gut purge and antibiotics among participants eating an omnivorous diet, a vegan diet, or an exclusive enteral nutrition (EEN) diet lacking in fiber. After the antibiotics, levels of fecal carnitine and acylcarnitines increased significantly in all groups, suggesting that microbiota were consuming these molecules.
To clarify the relationship between inflammation and levels of carnitine and acylcarnitines in the absence of microbiota, Dr. Wu and colleagues employed a germ-free mouse model with dextran sodium sulfate (DSS)–induced colitis. Levels of both molecule types were significantly increased in bile and plasma of mice with colitis versus those that were not exposed to DSS.
“Because the gut microbiota consumes both carnitine and acylcarnitines, these results are consistent with the notion that the increase of these metabolites in the feces of patients with IBD is driven by increased biliary delivery of acylcarnitines to the lumen combined with the reduced number and function of mitochondria in the colonic epithelium as previously reported,” the investigators wrote.
Further experiments with plated cultures and mice revealed that various bacterial species consumed carnitine and acylcarnitines in distinct patterns. Enterobacteriaceae demonstrated a notable proclivity for consumption in vitro and within the murine gut.
“As a high-dimensional analytic feature, the pattern of fecal acylcarnitines, perhaps together with bacterial taxonomy, may have utility as a biomarker for the presence or prognosis of IBD,” Dr. Wu and colleagues concluded. “In addition, based on currently available information about the impact of carnitine on the biology of Enterobacteriaceae, acylcarnitines also may have an important functional effect on the biology of the gut microbiota that is relevant to the pathogenesis or course of disease in patients with IBD.”
The study was supported by the Crohn’s and Colitis Foundation, the PennCHOP Microbiome Program, the Penn Center for Nutritional Science and Medicine, and others. The investigators disclosed no conflicts of interest.
The description of noninvasive biomarkers for inflammatory bowel disease (IBD) is key to better characterizing the disease pathogenesis. In this new publication, Lemons et al. describe deleterious effects of gut luminal carnitine and acylcarnitine in pediatric IBD patients, showing that these metabolites can serve as energy substrates for the microbiota, especially Enterobacteriaceae, promoting the growth of pathobionts and contributing to the persistence of dysbiosis, which, in turn, may worsen the course of IBD. In fact, acylcarnitine had been highlighted as a potential new target for IBD during dysbiosis by a previous multi-omics study of the gut microbiome. Moreover, Dr. Gary Wu’s team has shown that the intestinal epithelium can take up and use acylcarnitine as an alternative source for energy production. However, epithelial mitochondrial dysfunction triggered by inflammation reduces the capacity of colonocytes to consume long-chain fatty acids, thus enhancing the fecal levels of acylcarnitine, as described in IBD patients.
FROM CELLULAR AND MOLECULAR GASTROENTEROLOGY AND HEPATOLOGY
AGA CPU updates usage of vasoactive drugs, IV albumin, for cirrhosis
The publication, authored by Vincent Wai-Sun Wong, MBChB, MD, and colleagues, includes 12 best-practice-advice statements concerning 3 common clinical scenarios: variceal hemorrhage, ascites and spontaneous bacterial peritonitis, and acute kidney injury and hepatorenal syndrome.
These complications of liver decompensation “are manifestations of portal hypertension with a [consequent] vasodilatory–hyperdynamic circulatory state, resulting in progressive decreases in effective arterial blood volume and renal perfusion,” the update authors wrote in November in Gastroenterology. “Because a potent vasoconstrictor, terlipressin, was recently approved by the United States Food and Drug Administration and because recent trials have explored use of intravenous albumin in other settings, it was considered that a best practice update would be relevant regarding the use of vasoactive drugs and intravenous albumin in these 3 specific scenarios.”
Variceal Hemorrhage
Noting that variceal hemorrhage comprises 70% of all upper GI hemorrhage in patients with cirrhosis and carries a 6-week mortality rate as high as 43%, Dr. Wong and colleagues advise immediate initiation of vasoactive drugs upon suspicion of variceal hemorrhage, ideally before diagnostic and/or therapeutic endoscopy.
“The goals of management of acute variceal hemorrhage include initial hemostasis, preventing early rebleeding, and reducing in-hospital and 6-week mortality,” they wrote, noting that vasoactive drugs are effective at stopping bleeding in up to 8 out of 10 cases.
In patients with acute variceal hemorrhage undergoing endoscopic hemostasis, vasoactive agents should be continued for 2-5 days to prevent early rebleeding, according to the second best-practice-advice statement.
The third statement suggests octreotide as the drug of choice for variceal hemorrhage due to its favorable safety profile.
“Nowadays, vasopressin is no longer advised in patients with acute variceal hemorrhage because of a high risk of cardiovascular adverse events,” the update authors noted.
Ascites and Spontaneous Bacterial Peritonitis
In cases requiring large-volume (greater than 5 L) paracentesis, intravenous albumin should be administered at time of fluid removal, according to the update. In these patients, albumin reduces the risk of post-paracentesis circulatory dysfunction (defined as an increase in plasma renin activity), thereby reducing the risk of acute kidney injury.
Intravenous albumin should also be considered in patients with spontaneous bacterial peritonitis as this can overcome associated vasodilatation and decreased effective arterial blood volume, which may lead to acute kidney injury if untreated. In contrast, because of a demonstrated lack of efficacy, albumin is not advised in infections other than spontaneous bacterial peritonitis, unless associated with acute kidney injury.
Long-term albumin administration should be avoided in patients with cirrhosis and uncomplicated ascites, whether they are hospitalized or not, as evidence is lacking to support a consistent beneficial effect.
The update also advises against vasoconstrictors in patients with uncomplicated ascites, bacterial peritonitis, and after large-volume paracentesis, again due to a lack of supporting evidence.
Acute Kidney Injury and Hepatorenal Syndrome
Dr. Wong and colleagues called albumin “the volume expander of choice in hospitalized patients with cirrhosis and ascites presenting with acute kidney injury”; however, they caution that the dose of albumin “should be tailored to the volume status of the patient.”
The update authors considered both terlipressin and norepinephrine suitable options for patients with cirrhosis and hepatorenal syndrome, but they favored terlipressin based on available evidence, and they recommended concomitant albumin administration, as it may further improve renal blood flow by filling the central circulation.
Terlipressin also has the advantage (over norepinephrine) of being administrable via a peripheral line without the need for intensive care unit monitoring, the update authors wrote. The agent is contraindicated in patients with hypoxia or with coronary, peripheral, or mesenteric ischemia, and it should be used with caution in patients with acute-on-chronic liver failure (ACLF) grade 3, according to the publication. Risks of terlipressin may also outweigh benefits in patients with a serum creatinine greater than 5 mg/dL and those listed for transplant with a MELD score of 35 or higher.
The Clinical Practice Update was commissioned and supported by AGA. The authors disclosed relationships with Advanz, Boehringer Ingelheim, 89bio, and others.
FROM GASTROENTEROLOGY