Sex-Related Differences Found in IgG4-Related Disease Epidemiology
TOPLINE:
Men with immunoglobulin G4 (IgG4)-related disease exhibit significantly lower serum lipase levels and a greater likelihood of pancreatic and renal involvement than women, highlighting marked sex-dependent differences in disease manifestations.
METHODOLOGY:
- Researchers conducted a retrospective study of 328 patients diagnosed with IgG4-related disease at the Massachusetts General Hospital Rheumatology Clinic, Boston, who met the American College of Rheumatology–European Alliance of Associations for Rheumatology (ACR-EULAR) classification criteria between January 2008 and May 2023.
- Men made up 69% of the cohort and women 31%, a male-to-female ratio of 2.2:1.0 (see the brief check after this list); men were typically older at diagnosis (median age, 63.7 vs 58.2 years).
- Data on serum lipase levels, renal involvement, and other clinical and laboratory parameters were collected.
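As a quick illustrative check (not a calculation reported by the authors), the stated percentages reproduce the reported sex ratio:

\[
\frac{\text{men}}{\text{women}} \approx \frac{0.69 \times 328}{0.31 \times 328} = \frac{69}{31} \approx 2.2
\]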
TAKEAWAY:
- Men had higher baseline ACR-EULAR scores, indicating more severe disease (median score of 35.0 vs 29.5; P = .0010).
- Male patients demonstrated a median baseline serum lipase concentration of 24.5 U/L, significantly lower than the 33.5 U/L observed in women.
- Pancreatic (50% vs 26%) or renal (36% vs 18%) involvement was more common in men.
- Men exhibited higher IgG4 levels (P = .0050) and active B-cell responses in the blood (P = .0095).
IN PRACTICE:
According to the authors, this work confirms “the impression of an important sex disparity among patients with IgG4-related disease, with most patients being male, and male patients demonstrating strong tendencies toward more severe disease than female patients.”
SOURCE:
The study was led by Isha Jha, MD, Massachusetts General Hospital, Boston. It was published online on May 30, 2024, in The Lancet Rheumatology.
LIMITATIONS:
The study’s retrospective design may limit the ability to establish causality between sex differences and IgG4-related disease manifestations. A relatively small percentage of patients were assessed before receiving any immunosuppressive treatment, potentially influencing the observed clinical parameters.
DISCLOSURES:
This work was supported by the National Institutes of Health/National Institute of Allergy and Infectious Diseases, the Rheumatology Research Foundation, and the National Institute of Arthritis and Musculoskeletal and Skin Diseases. Some authors declared financial ties outside this work.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication.
A version of this article first appeared on Medscape.com.
Why Do Investigational OA Drugs Need Better Trial Endpoints? Lorecivivint Serves as an Example
VIENNA — The recent failure of the phase 3 trial of the investigational agent lorecivivint appears consistent with the hypothesis that pivotal clinical trials of osteoarthritis (OA)-modifying therapies are not using appropriate designs or endpoints, according to experts tackling this issue.
For the elusive target of disease-modifying OA drugs (DMOADs), “there have been a lot of developments in the last few years but so far a lot of disappointments,” said Francis Berenbaum, MD, PhD, head of the department of rheumatology, Saint-Antoine Hospital, Paris, France.
Disagreement on the target most likely to favorably alter the natural history of disease might be the key issue. Dr. Berenbaum considers it essential to determine which changes in the joint signify a favorable drug effect and will lead to what regulatory agencies consider a clinically meaningful benefit. These include improved function and long-term preservation of the joint, as well as symptom control.
Cartilage is not among the primary targets for modifying the course of OA, according to Dr. Berenbaum, who spoke in a session on DMOADs and regenerative OA therapies at the annual European Congress of Rheumatology.
OA Is Not a Cartilage-Only Disease
“There is now a big consensus that osteoarthritis is not a cartilage-only disease,” he said. Rather, he addressed the inadequate appreciation of the “whole joint” pathology that underlies OA. He called for a fundamental “paradigm change” to work toward a disease-modifying effect that produces benefit on a hard endpoint.
There are multiple steps needed to work toward this goal after a consensus is reached on a meaningful surrogate endpoint, Dr. Berenbaum said. While symptom reduction is a good start, he called for evidence of disease attenuation or a regenerative effect on an important surrogate such as improved integrity of synovial tissue and improved bone health. Such surrogates are necessary to guide DMOAD development but not sufficient. The proof that a therapy is a DMOAD depends on a favorable effect on a hard endpoint. In the case of the knee, freedom from joint replacement is an example, Dr. Berenbaum said.
Philip G. Conaghan, MBBS, PhD, director of rheumatic and musculoskeletal medicine, University of Leeds, England, agreed with this general premise. Speaking just before Dr. Berenbaum in the same session, Dr. Conaghan traced the history of the effort to develop DMOADs and provided updates on agents in clinical trials.
In his talk, he listed some of the many disappointments, including agents that targeted cartilage thickness, before reviewing the numerous ongoing development programs. Many targets appear viable, but none are in the final stages of testing.
In remarks to this news organization, he said he generally agreed with Dr. Berenbaum about the need for greater rigor for developing drugs to meet regulatory criteria for disease-modifying effects.
Of the drugs he reviewed, Dr. Conaghan identified lorecivivint, an intra-articular CLK/DYRK inhibitor that’s thought to modulate Wnt and inflammatory pathways, as the only drug with DMOAD potential to go to a multicenter phase 3 trial so far. The drug’s negative outcome in phase 3 was particularly disappointing after the substantial promise shown in a phase 2b study published in 2021.
In the phase 3 study, lorecivivint did not achieve significant improvement over placebo in the primary endpoint, medial joint space width (JSW) in the target knee, as assessed at the end of the 48-week, double-blind trial.
New Follow-Up Data Support DMOAD Activity
Yet, additional extension data from the phase 3 lorecivivint trial presented in the EULAR DMOAD session challenge the conclusion even if they do not change the results.
The new data presented at EULAR represent the second of two sets of extension data. The first, reported earlier, involved an analysis at 96 weeks, or 48 weeks after the end of the double-blind period. At the beginning of this extension, lorecivivint-start patients received a second intra-articular injection of 0.07 mg, while placebo patients were crossed over to receive their first injection.
Over the course of this first extension, the gradual loss in medial JSW observed from baseline to the end of the initial 48 weeks had plateaued in those treated with lorecivivint, but the decline continued in the placebo group. As a result, the lorecivivint-start patients had a numerical but not a statistically significant advantage for medial JSW over the placebo-switch group, according to Yusuf Yazici, MD, chief medical officer of Biosplice Therapeutics, San Diego, which developed lorecivivint.
In a second open-label extension described by Dr. Yazici at EULAR 2024, a third injection was administered to the lorecivivint-start patients and a second injection to the placebo-start patients. After 52 more weeks of follow-up, there were now 3 years of follow-up in the lorecivivint-start group and 2 years of follow-up in the placebo-start group.
At the end of this second extension, median JSW in the lorecivivint-start patients had increased back toward the level at study entry. Although the placebo-start group had experienced a decline in medial JSW at the end of the first extension, after a single injection, their JSW also improved toward baseline after a second injection and 2 years of follow-up. The advantage of three injections of lorecivivint over 3 years nevertheless reached statistical significance (P = .031) despite the improvement seen in the placebo-start group following two injections over 2 years.
At 3 Years, Benefit Is Finally Potentially Significant
If placebo-treated patients had not received a second shot of lorecivivint and progressed at the rate seen before their second shot, the hypothetical trajectory would have provided lorecivivint with a highly statistically significant advantage (P < .001), said Dr. Yazici, displaying a hypothetical graph.
Along with improvements in pain and function associated with lorecivivint relative to placebo at 6, 12, and 24 months, the structural improvements at 3 years now suggest that “long-term treatment with lorecivivint has the potential to be a DMOAD for knee OA,” Dr. Yazici said.
While Dr. Berenbaum did not comment on this speculation, he did note the potential need for long-term studies to prove a disease-modifying effect in OA. This is the rationale for identifying surrogates.
To illustrate this point, Dr. Berenbaum made an analogy between OA and cardiovascular disease. In cardiovascular disease, surrogates of disease-modifying therapies, such as control of hypertension or hyperlipidemia, are accepted by regulatory agencies on the basis of their proven association with hard endpoints, such as myocardial infarction, stroke, or cardiovascular death. Like joint failure, these events can take years or decades to arise.
“For trials in OA, we need to agree on these surrogates,” Dr. Berenbaum said, although he acknowledged that they would then have to be validated. Noting that the US Food and Drug Administration has now identified OA as a serious disease for which accelerated drug approvals will be considered to address an unmet need, Dr. Berenbaum suggested there is an even greater impetus for improving strategies for DMOAD development.
Dr. Berenbaum reported financial relationships with Grünenthal, GlaxoSmithKline, Eli Lilly, Novartis, Pfizer, and Servier. Dr. Conaghan reported financial relationships with AbbVie, AstraZeneca, Eli Lilly, Galapagos, GlaxoSmithKline, Grünenthal, Janssen, Levicept, Merck, Novartis, Pfizer, Regeneron, Stryker, and UCB. Dr. Yazici is an employee of Biosplice Therapeutics, which provided funding for the OAS-07 trial.
A version of this article first appeared on Medscape.com.
FROM EULAR 2024
Akira Endo, the Father of Statins, Dies
Akira Endo, PhD, the Japanese microbiologist and biochemist known as the father of statins, died at the age of 90 on June 5. His research led to the discovery and rise of a class of drugs that revolutionized the prevention and treatment of cardiovascular diseases. This scientific journey began over half a century ago.
Inspired by Alexander Fleming
Born into a family of farmers in northern Japan, Dr. Endo was fascinated by natural sciences from a young age and showed a particular interest in fungi and molds. At the age of 10, he already knew he wanted to become a scientist.
He studied in Japan and the United States, conducting research at the Albert Einstein College of Medicine in New York City. He was struck by the high number of elderly and overweight individuals in the United States and realized the importance of developing a drug to combat cholesterol. It was upon his return to Japan, when he joined the Sankyo laboratory, that the development of statins began.
Inspired by Alexander Fleming, who discovered penicillin in the mold Penicillium, he hypothesized that fungi could produce antibiotics inhibiting 3-hydroxy-3-methylglutaryl coenzyme A (HMG-CoA) reductase, the rate-limiting enzyme in cholesterol synthesis.
After a year of research on nearly 3800 strains, his team found a known substance, citrinin, that strongly inhibited HMG-CoA reductase and lowered serum cholesterol levels in rats. The research was halted because of citrinin’s toxicity to the rodents’ kidneys. “Nevertheless, the experience with citrinin gave us hope and courage to quickly discover much more effective active substances,” said Dr. Endo in an article dedicated to the discovery of statins.
First Statin Discovered
In the summer of 1972, researchers discovered a second active culture broth, Penicillium citrinum Pen-51, which was isolated from a sample of rice collected in a grain store in Kyoto.
In July 1973, they isolated three active metabolites from this mold, one of which was compactin, which had structural similarities to HMG-CoA, the substrate of the HMG-CoA reductase reaction.
In 1976, they published two articles reporting the discovery and characterization of compactin (mevastatin), the first statin.
Several Setbacks
Unfortunately, when Sankyo biologists assessed the effectiveness of compactin by giving rats a diet supplemented with compactin for 7 days, no reduction in serum cholesterol was observed.
Only later did an unpublished study show that the statin significantly decreased plasma cholesterol after a month of treatment in laying hens. The hypocholesterolemic effects of compactin were then demonstrated in dogs and monkeys.
However, researchers faced a second challenge in April 1977. Microcrystalline structures were detected in the liver cells of rats that had been fed extremely high amounts of compactin (over 500 mg/kg per day for 5 weeks). Initially deemed toxic, the structures were ultimately found to be nontoxic.
A phase 2 trial began in the summer of 1979 with very encouraging preliminary results, but in August 1980, clinical development of compactin was halted, as the drug was suspected of causing lymphomas in dogs given very high doses: 100 or 200 mg/kg per day for 2 years.
This suspicion also led to the termination of trials on another statin, the closely related lovastatin, which was discovered simultaneously from different fungi by the Merck laboratory and Dr. Endo in February 1979.
First Statin Marketed
Clinical development of lovastatin later resumed, and it was confirmed that the drug significantly reduced cholesterol levels and was well tolerated. No tumors were detected.
Lovastatin received approval from the Food and Drug Administration to become the first marketed statin in September 1987.
Dr. Endo received numerous awards for his work, including the Albert Lasker Award for Clinical Medical Research in 2008 and the Outstanding Achievement Award from the International Atherosclerosis Society in 2009.
This story was translated from the Medscape French edition using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication.
A version of this article appeared on Medscape.com.
Ghrelin Paradox: Unlocking New Avenues in Obesity Management
Despite their best efforts, 80% of people who lose weight regain it and many end up heavier within 5 years. Why? Our bodies fight back, revving up hunger while slowing metabolism after weight loss. In ongoing obesity discussions, ghrelin is in the spotlight as the “hunger hormone” playing a crucial role in driving appetite and facilitating weight gain.
Weight loss interventions, such as diet or gastric bypass surgery, may trigger an increase in ghrelin levels, potentially fueling long-term weight gain. Consequently, ghrelin remains a focal point of research into innovative antiobesity treatments.
Ghrelin, produced primarily in the stomach, is a circulating orexigenic gut hormone with growth hormone–releasing activity.
Since the discovery of ghrelin in 1999, research in mice and people has focused on its effect on regulating appetite and its implications for long-term weight control. When hunger strikes, ghrelin levels surge, sending signals to the brain that ramp up appetite. Following a meal, ghrelin decreases, indicating fullness.
Studies have found that people who were injected with subcutaneous ghrelin experienced a 46% increase in hunger and ate 28% more at their next meal than those who didn’t receive a ghrelin injection.
We might expect high levels of ghrelin in individuals with obesity, but this is not the case. In fact, ghrelin levels are typically lower in individuals with obesity than in leaner individuals. This finding might seem to contradict the idea that obesity is due to high levels of the hunger hormone.
Excess weight could increase sensitivity to ghrelin: with more receptors, less ghrelin is needed to stimulate hunger. Beyond hunger, ghrelin can also lead us to eat for comfort, as when stressed or anxious. Ghrelin and synthetic ghrelin mimetics increase body weight and fat mass by activating receptors in the arcuate nucleus of the hypothalamus (Müller et al.; Bany Bakar et al.). Ghrelin also activates the brain’s reward pathways, making us crave food even when we are not hungry. This connection between ghrelin and emotional eating can contribute to stress-induced obesity.
In my clinical practice, I have seen individuals gain the most weight when they are under more stress and are sleep-deprived, because ghrelin levels increase in these scenarios. This elevation of ghrelin in high-stress, low-sleep situations contributes to weight gain in women during the postpartum period and menopause.
Evidence also suggests that certain foods affect ghrelin levels. After a person eats carbohydrates, their ghrelin levels initially decrease quickly, but this is followed by a rise in ghrelin, leading them to become hungry again. In contrast, protein intake helps suppress ghrelin levels for longer. Hence, we advise patients to increase protein intake while reducing their carb intake, or to always eat protein along with carbs.
It makes sense that when individuals with obesity lose weight by fasting or caloric restriction and try to maintain that weight loss, their bodies tend to produce more ghrelin. This effect might explain why people who lose weight often find it hard to keep it off: Rising ghrelin levels after weight loss might drive them to eat more and regain weight.
Two prominent weight loss surgeries, sleeve gastrectomy (SG) and Roux-en-Y gastric bypass (RYGB), have opposite effects on ghrelin levels, reflecting their distinct mechanisms for weight loss. SG involves removal of the gastric fundus, where ghrelin is produced, resulting in a significant decrease in ghrelin levels; RYGB operates through malabsorption without directly affecting ghrelin production. Despite these differing approaches, both techniques demonstrate remarkable weight loss efficacy. Research comparing the two procedures reveals that SG leads to decreased fasting plasma ghrelin levels, whereas RYGB prompts an increase, highlighting the additional appetite-reducing mechanism of SG through ghrelin suppression. This contrast underscores the intricate role of ghrelin in appetite regulation and suggests that its manipulation can significantly influence weight loss outcomes.
With the effect of ghrelin in stimulating appetite being established, other studies have explored the relationship between ghrelin and insulin resistance. A meta-analysis by researchers at Qingdao University, Qingdao, China, found that circulating ghrelin levels were negatively correlated with insulin resistance in individuals with obesity and normal fasting glucose levels. The findings suggest that the role of ghrelin in obesity might extend beyond appetite regulation to influence metabolic pathways and that ghrelin may be a marker for predicting obesity.
Researchers are exploring potential therapeutic targets focused on ghrelin modulation. Although selective neutralization of ghrelin has not yielded consistent results in rodent models, the interplay between ghrelin and LEAP2, a hormone that acts on the same receptor, could be an area of interest for future obesity treatments.
Could ghrelin be the key to tackling obesity? Blocking ghrelin pharmacologically might be a strategy to keep weight off after weight loss, and it could help prevent the typical rebound effect seen with diets and withdrawal of medications. Considering the high rates of weight regain after diet-induced weight loss and withdrawal of weight loss medications, targeting ghrelin might be the missing link in long-term obesity treatment. It could be a valuable approach to improving long-term outcomes for obesity. However, these blockers might have significant side effects, given that ghrelin affects not only hunger but also the brain’s reward and pleasure centers. Therefore, caution will be needed in developing such medications owing to their potential impact on mood and mental health.
With ghrelin playing roles in hunger, reward pathways, and energy regulation, understanding this hormone is crucial in the fight against obesity. Stay tuned for future research that could shed light on the underlying mechanisms at play and, hopefully, result in clinical action steps.
Dimpi Desai, MD, is a professor in the Department of Medicine, Division of Endocrinology, Gerontology, and Metabolism, Stanford University, Stanford, California, and has disclosed no relevant financial relationships. Ashni Dharia, MD, is a resident in the Department of Internal Medicine, Allegheny General Hospital, Pittsburgh, Pennsylvania.
A version of this article appeared on Medscape.com.
Despite their best efforts, 80% of people who lose weight regain it and many end up heavier within 5 years. Why? Our bodies fight back, revving up hunger while slowing metabolism after weight loss. In ongoing obesity discussions, ghrelin is in the spotlight as the “hunger hormone” playing a crucial role in driving appetite and facilitating weight gain.
Weight loss interventions, such as diet or gastric bypass surgery, may trigger an increase in ghrelin levels, potentially fueling long-term weight gain. Consequently, ghrelin remains a focal point of research into innovative antiobesity treatments.
Ghrelin, a hormone produced in the stomach, is often called the “hunger hormone.” Ghrelin is a circulating orexigenic gut hormone with growth hormone–releasing activity.
Since the discovery of ghrelin, in 1999, research in mice and people has focused on its effect on regulating appetite and implications for long-term weight control. When hunger strikes, ghrelin levels surge, sending signals to the brain that ramp up the appetite. Following a meal, ghrelin decreases, indicating fullness.
Studies have found that people who were injected with subcutaneous ghrelin experienced a 46% increase in hunger and ate 28% more at their next meal than those who didn’t receive a ghrelin injection.
We might expect high levels of ghrelin in individuals with obesity, but this is not the case. In fact, ghrelin levels are typically lower in individuals with obesity than in leaner individuals. This finding might seem to contradict the idea that obesity is due to high levels of the hunger hormone.
Excess weight could increase sensitivity to ghrelin, where more receptors lead to higher hunger stimulation with less ghrelin. Beyond hunger, ghrelin can also lead us to eat for comfort, as when stressed or anxious. Ghrelin and synthetic ghrelin mimetics increase body weight and fat mass by activating receptors in the arcuate nucleus of the hypothalamus (Müller et al.; Bany Bakar et al.). There, it also activates the brain’s reward pathways, making us crave food even when we are not hungry. This connection between ghrelin and emotional eating can contribute to stress-induced obesity.
In my clinical practice, I have seen individuals gain maximum weight when they are under more stress and are sleep-deprived. This is because ghrelin levels increased in these scenarios. This elevation of ghrelin in high-stress, low-sleep situations affects weight gain in women during the postpartum period and menopause.
Evidence also suggests that certain foods affect ghrelin levels. After a person eats carbohydrates, their ghrelin levels initially decrease quickly, but this is followed by a rise in ghrelin, leading them to become hungry again. In contrast, protein intake helps suppress ghrelin levels for longer. Hence, we advise patients to increase protein intake while reducing their carb intake, or to always eat protein along with carbs.
It makes sense that when individuals with obesity lose weight by fasting or caloric restriction and try to maintain that weight loss, their bodies tend to produce more ghrelin. This effect might explain why people who lose weight often find it hard to keep it off: Rising ghrelin levels after weight loss might drive them to eat more and regain weight.
Two prominent weight loss surgeries, sleeve gastrectomy (SG) and Roux-en-Y gastric bypass (RYGB), have opposite effects on ghrelin levels, reflecting their distinct mechanisms for weight loss. SG involves removal of the gastric fundus, where ghrelin is produced, resulting in a significant decrease in ghrelin levels; RYGB operates through malabsorption without directly affecting ghrelin production. Despite these differing approaches, both techniques demonstrate remarkable weight loss efficacy. Research comparing the two procedures reveals that SG leads to decreased fasting plasma ghrelin levels, whereas RYGB prompts an increase, highlighting the additional appetite-reducing mechanism of SG through ghrelin suppression. This contrast underscores the intricate role of ghrelin in appetite regulation and suggests that its manipulation can significantly influence weight loss outcomes.
With the effect of ghrelin in stimulating appetite being established, other studies have explored the relationship between ghrelin and insulin resistance. A meta-analysis by researchers at Qingdao University, Qingdao, China, found that circulating ghrelin levels were negatively correlated with insulin resistance in individuals with obesity and normal fasting glucose levels. The findings suggest that the role of ghrelin in obesity might extend beyond appetite regulation to influence metabolic pathways and that ghrelin may be a marker for predicting obesity.
Researchers are exploring potential therapeutic targets focusing on ghrelin modulation. Although selective neutralization of ghrelin has not yielded consistent results in rodent models, the interplay between ghrelin and LEAP2— a hormone that attaches to the same brain receptors — could be an area of interest for future obesity treatments.
Could ghrelin be the key to tackling obesity? Blocking ghrelin pharmacologically might be a strategy to keep weight off after weight loss, and it could help prevent the typical rebound effect seen with diets and withdrawal of medications. Considering the high rates of weight regain after diet-induced weight loss and withdrawal of weight loss medications, targeting ghrelin might be the missing link in long-term obesity treatment. It could be a valuable approach to improving long-term outcomes for obesity. However, these blockers might have significant side effects, given that ghrelin affects not only hunger but also the brain’s reward and pleasure centers. Therefore, caution will be needed in developing such medications owing to their potential impact on mood and mental health.
With ghrelin playing roles in hunger, reward pathways, and energy regulation, understanding this hormone is crucial in the fight against obesity. Stay tuned for future research that could shed light on the underlying mechanisms at play and hopefully results in clinical action steps.
Dimpi Desai, MD, is a professor in the Department of Medicine, Division of Endocrinology, Gerontology, and Metabolism, Stanford University, Stanford, California, and has disclosed no relevant financial relationships. Ashni Dharia, MD, is a resident in the Department of Internal Medicine, Allegheny General Hospital, Pittsburgh, Pennsylvania.
A version of this article appeared on Medscape.com.
New Clues on How Blast Exposure May Lead to Alzheimer’s Disease
In October 2023, Robert Card — a grenade instructor in the Army Reserve — shot and killed 18 people in Maine, before turning the gun on himself. As reported by The New York Times, his family said that he had become increasingly erratic and violent during the months before the rampage.
A postmortem conducted by the Chronic Traumatic Encephalopathy (CTE) Center at Boston University found “significant evidence of traumatic brain injuries” [TBIs] and “significant degeneration, axonal and myelin loss, inflammation, and small blood vessel injury” in the white matter, the center’s director, Ann McKee, MD, said in a press release. “These findings align with our previous studies on the effects of blast injury in humans and experimental models.”
Members of the military, such as Mr. Card, are exposed to blasts from repeated firing of heavy weapons not only during combat but also during training.
A higher index of suspicion for dementia or Alzheimer’s disease may be warranted in patients with a history of blast exposure or subconcussive brain injury who present with cognitive issues, according to experts interviewed.
In 2022, the US Department of Defense (DOD) launched its Warfighter Brain Health Initiative with the aim of “optimizing service member brain health and countering traumatic brain injuries.”
In April 2024, the Blast Overpressure Safety Act was introduced in the Senate to require the DOD to enact better blast screening, tracking, prevention, and treatment. The DOD initiated 26 blast overpressure studies.
Heather Snyder, PhD, Alzheimer’s Association vice president of Medical and Scientific Relations, said that an important component of that research involves “the need to study the difference between TBI-caused dementia and dementia caused independently” and “the need to study biomarkers to better understand the long-term consequences of TBI.”
What Is the Underlying Biology?
Dr. Snyder was the lead author of a white paper produced by the Alzheimer’s Association in 2018 on military-related risk factors for Alzheimer’s disease and related dementias. “There is a lot of work trying to understand the effect of pure blast waves on the brain, as opposed to the actual impact of the injury,” she said.
The white paper speculated that blast exposure may be analogous to subconcussive brain injury in athletes, in which there are no obvious immediate clinical symptoms or neurological dysfunction but cumulative injury and functional impairment can develop over time.
“We are also trying to understand the underlying biology around brain changes, such as accumulation of tau and amyloid and other specific markers related to brain changes in Alzheimer’s disease,” said Dr. Snyder, chair of the Peer Reviewed Alzheimer’s Research Program Programmatic Panel for Alzheimer’s Disease/Alzheimer’s Disease and Related Dementias and TBI.
Common Biomarker Signatures
A recent study in Neurology comparing 51 veterans with mild TBI (mTBI) with 85 veterans and civilians with no lifetime history of TBI is among the first to explore these biomarker changes in human beings.
“Our findings suggest that chronic neuropathologic processes associated with blast mTBI share properties in common with pathogenic processes that are precursors to Alzheimer’s disease onset,” said coauthor Elaine R. Peskind, MD, professor of psychiatry and behavioral sciences, University of Washington, Seattle.
The participants, who were largely male with a mean age of 34 years, underwent standardized clinical and neuropsychological testing as well as lumbar puncture to collect cerebrospinal fluid (CSF). All members of the mTBI group had experienced at least one war zone blast or combined blast/impact that met criteria for mTBI, and 91% had experienced more than one blast mTBI. The study took place over 13 years.
The researchers found that the mTBI group “had biomarker signatures in common with the earliest stages of Alzheimer’s disease,” said Dr. Peskind.
For example, at age 50, they had lower mean levels of CSF amyloid beta 42 (Abeta42), the earliest marker of brain parenchymal Abeta deposition, compared with the control group (154 pg/mL and 1864 pg/mL lower, respectively).
High CSF phosphorylated tau181 (p-tau181) and total tau are established biomarkers for Alzheimer’s disease. However, levels of these biomarkers remained “relatively constant with age” in participants with mTBI but were higher in older ages for the non-TBI group.
The mTBI group also showed worse cognitive performance at older ages (P < .08). Poorer verbal memory and verbal fluency performance were associated with lower CSF Abeta42 in older participants (P ≤ .05).
In Alzheimer’s disease, a reduction in CSF Abeta42 may occur up to 20 years before the onset of clinical symptoms, according to Dr. Peskind. “But what we don’t know from this study is what this means, as total tau protein and p-tau181 in the CSF were also low, which isn’t entirely typical in the picture of preclinical Alzheimer’s disease,” she said. However, changes in total tau and p-tau181 lag behind changes in Abeta42.
Is Impaired Clearance the Culprit?
Coauthor Jeffrey Iliff, PhD, professor in the Departments of Psychiatry and Behavioral Sciences and of Neurology at the University of Washington, Seattle, elaborated.
“In the setting of Alzheimer’s disease, a signature of the disease is reduced CSF Abeta42, which is thought to reflect that much of the amyloid gets ‘stuck’ in the brain in the form of amyloid plaques,” he said. “There are usually higher levels of phosphorylated tau and total tau, which are thought to reflect the presence of tau tangles and degeneration of neurons in the brain. But in this study, all of those were lowered, which is not exactly an Alzheimer’s disease profile.”
Dr. Iliff, associate director for research, VA Northwest Mental Illness Research, Education, and Clinical Center at VA Puget Sound Health Care System, Seattle, suggested that the culprit may be impairment in the brain’s glymphatic system. “Recently described biological research supports [the concept of] clearance of waste out of the brain during sleep via the glymphatic system, with amyloid and tau being cleared from the brain interstitium during sleep.”
A recent hypothesis is that blast TBI impairs that process. “This is why we see less of those proteins in the CSF. They’re not being cleared, which might contribute downstream to the clumping up of protein in the brain,” he suggested.
The evidence base corroborating that hypothesis is in its infancy; however, new research conducted by Dr. Iliff and his colleagues sheds light on this potential mechanism.
In blast TBI, energy from the explosion and resulting overpressure wave are “transmitted through the brain, which causes tissues of different densities — such as gray and white matter — to accelerate at different rates,” according to Dr. Iliff. This results in the shearing and stretching of brain tissue, leading to a “diffuse pattern of tissue damage.”
It is known that blast TBI has clinical overlap and associations with posttraumatic stress disorder (PTSD), depression, and persistent neurobehavioral symptoms; that veterans with a history of TBI are more than twice as likely to die by suicide than veterans with no TBI history; and that TBI may increase the risk for Alzheimer’s disease and related dementing disorders, as well as CTE.
The missing link may be the glymphatic system — a “brain-wide network of perivascular pathways, along which CSF and interstitial fluid (ISF) exchange, supporting the clearance of interstitial solutes, including amyloid-beta.”
Dr. Iliff and his group previously found that glymphatic function is “markedly and chronically impaired” following impact TBI in mice and that this impairment is associated with the mislocalization of astroglial aquaporin 4 (AQP4), a water channel that lines perivascular spaces and plays a role in healthy glymphatic exchange.
In their new study, the researchers examined both the expression and the localization of AQP4 in the human postmortem frontal cortex and found “distinct laminar differences” in AQP4 expression following blast exposure. They observed similar changes as well as impairment of glymphatic function, which emerged 28 days following blast injury in a mouse model of repetitive blast mTBI.
And in a cohort of veterans with blast mTBI, blast exposure was found to be associated with an increased burden of frontal cortical MRI-visible perivascular spaces — a “putative neuroimaging marker” of glymphatic perivascular dysfunction.
The earlier Neurology study “showed impairment of biomarkers in the CSF, but the new study showed ‘why’ or ‘how’ these biomarkers are impaired, which is via impairment of the glymphatic clearance process,” Dr. Iliff explained.
Veterans Especially Vulnerable
Dr. Peskind, co-director of the VA Northwest Mental Illness Research, Education and Clinical Center, VA Puget Sound Health Care System, noted that while the veterans in the earlier study had at least one TBI, the average number was 20, and it was more common to have more than 50 mTBIs than to have a single one.
“These were highly exposed combat vets,” she said. “And that number doesn’t even account for subconcussive exposure to blasts, which now appear to cause detectable brain damage, even in the absence of a diagnosable TBI.”
The Maine shooter, Mr. Card, had not seen combat and was not assessed for TBI during a psychiatric hospitalization, according to The New York Times.
Dr. Peskind added that this type of blast damage is likely specific to individuals in the military. “It isn’t the sound that causes the damage,” she explained. “It’s the blast wave, the pressure wave, and there aren’t a lot of other occupations that have those types of occupational exposures.”
Dr. Snyder added that the majority of blast TBIs have been studied in military personnel, and she is not aware of studies that have looked at blast injuries in other industries, such as demolition or mining, to see if they have the same type of biologic consequences.
Dr. Snyder hopes that the researchers will follow the participants in the Neurology study and continue looking at specific markers related to Alzheimer’s disease brain changes. What the research so far shows “is that, at an earlier age, we’re starting to see those markers changing, suggesting that the underlying biology in people with mild blast TBI is similar to the underlying biology in Alzheimer’s disease as well.”
Michael Alosco, PhD, associate professor and vice chair of research, department of neurology, Boston University Chobanian & Avedisian School of Medicine, called the issue of blast exposure and TBI “a very complex and nuanced topic,” especially because TBI is “considered a risk factor of Alzheimer’s disease” and “different types of TBIs could trigger distinct pathophysiologic processes; however, the long-term impact of repetitive blast TBIs on neurodegenerative disease changes remains unknown.”
He coauthored an editorial on the earlier Neurology study that noted its limitations, such as a small sample size and lack of consideration of lifestyle and health factors, but acknowledged that the “findings provide preliminary evidence that repetitive blast exposures might influence beta-amyloid accumulation.”
Clinical Implications
For Dr. Peskind, the “inflection point” was seeing lower CSF Abeta42 roughly 20 years earlier than ages 60 and 70, when such changes are more typical in cognitively normal community volunteers.
But she described herself as “loath to say that veterans or service members have a 20-year acceleration of risk of Alzheimer’s disease,” adding, “I don’t want to scare the heck out of our service members or veterans.” Although “this is what we fear, we’re not ready to say it for sure yet because we need to do more work. Nevertheless, it does increase the index of suspicion.”
The clinical take-home messages are not unique to service members or veterans or people with a history of head injuries or a genetic predisposition to Alzheimer’s disease, she emphasized. “If anyone of any age or occupation comes in with cognitive issues, such as [impaired] memory or executive function, they deserve a workup for dementing disorders.” Frontotemporal dementia, for example, can present earlier than Alzheimer’s disease typically does.
Common comorbidities with TBI are PTSD and obstructive sleep apnea (OSA), which can also cause cognitive issues and are also risk factors for dementia.
Dr. Iliff agreed. “If you see a veteran with a history of PTSD, a history of blast TBI, and a history of OSA or some combination of those three, I recommend having a higher index of suspicion [for potential dementia] than for an average person without any of these, even at a younger age than one would ordinarily expect.”
Of all of these factors, the only truly directly modifiable one is sleep disruption, including that caused by OSA or sleep disorders related to PTSD, he added. “Epidemiologic data suggest a connection particularly between midlife sleep disruption and the risk of dementia and Alzheimer’s disease, and so it’s worth thinking about sleep as a modifiable risk factor even as early as the 40s and 50s, whether the patient is or isn’t a veteran.”
Dr. Peskind recommended asking patients, “Do they snore? Do they thrash about during sleep? Do they have trauma nightmares? This will inform the type of intervention required.”
Dr. Alosco added that there is no known “safe” threshold of exposure to blasts, and that thresholds are “unclear, particularly at the individual level.” In American football, there is a dose-response relationship between years of play and risk for later-life neurological disorder. “The best way to mitigate risk is to limit cumulative exposure,” he said.
The study by Li and colleagues was funded by grant funding from the Department of Veterans Affairs Rehabilitation Research and Development Service and the University of Washington Friends of Alzheimer’s Research. Other sources of funding to individual researchers are listed in the original paper. The study by Braun and colleagues was supported by the National Heart, Lung and Blood Institute; the Department of Veterans Affairs Rehabilitation Research and Development Service; and the National Institute on Aging. The white paper included studies that received funding from numerous sources, including the National Institutes of Health and the DOD. Dr. Iliff serves as the chair of the Scientific Advisory Board for Applied Cognition Inc., from which he receives compensation and in which he holds an equity stake. In the last year, he served as a paid consultant to Gryphon Biosciences. Dr. Peskind has served as a paid consultant to the companies Genentech, Roche, and Alpha Cognition. Dr. Alosco was supported by grant funding from the NIH; he received research support from Rainwater Charitable Foundation Inc., and Life Molecular Imaging Inc.; he has received a single honorarium from the Michael J. Fox Foundation for services unrelated to this editorial; and he received royalties from Oxford University Press Inc. The other authors’ disclosures are listed in the original papers.
A version of this article appeared on Medscape.com.
Ixekizumab Met Phase 3 Trial Endpoint in Juvenile PsA, Enthesitis-Related Arthritis
VIENNA — Ixekizumab (Taltz), an interleukin-17A inhibitor already approved for the treatment of psoriatic arthritis and axial spondyloarthritis in adults, appears likely to be granted the corresponding indications in children, based on initial results from an open-label, phase 3 trial that employed adalimumab as a reference.
With a safety profile comparable with that seen in adult patients, ixekizumab “met the prespecified criterion for success at 16 weeks,” reported Athimalaipet V. Ramanan, MD, PhD, of Bristol Royal Hospital for Children and Translational Health Sciences, Bristol, England.
In this multicenter, randomized, open-label trial called COSPIRIT-JIA, which is still ongoing, investigators enrolled 101 children with active juvenile PsA (JPsA) or enthesitis-related arthritis (ERA), which is akin to spondyloarthritis in adults.
The efficacy and safety data at 16 weeks were presented as a late-breaking abstract at the annual European Congress of Rheumatology. Dr. Ramanan said that the open-label extension to 104 weeks is underway and further follow-up out to 264 weeks is planned.
Nearly 90% Achieve ACR30
The trial had an adaptive design in which the first 40 patients without biologic experience were randomized to ixekizumab or adalimumab, stratified by JPsA or ERA diagnosis; the subsequent 61 patients, with either no biologic experience or an inadequate response or intolerance to biologics, all received ixekizumab. The drugs were dosed according to weight. Dr. Ramanan explained that a placebo-controlled trial was considered unethical because of the strong evidence of benefit from biologics for JPsA and ERA.
The trial easily met its predefined threshold for success, which required ≥ 80% probability, based on Bayesian analysis, that ≥ 50% of patients would have 30% improvement in American College of Rheumatology response criteria (ACR30) at week 16. ACR30 was achieved in 88.9% of those treated with ixekizumab overall vs 95.0% of those treated with adalimumab, but the trial was not designed as a head-to-head comparison. Rather, adalimumab served as a reference.
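To make that success criterion concrete, here is a minimal sketch of how such a probability statement could be checked under a simple Beta-Binomial model with a uniform prior. Both the model and the responder counts are illustrative assumptions; the article does not describe the actual Bayesian analysis used in COSPIRIT-JIA.

```python
# Illustrative sketch only: assumes a Beta-Binomial model with a uniform Beta(1, 1)
# prior, which may differ from the model used in COSPIRIT-JIA. The responder counts
# below are hypothetical, chosen to match the reported ~88.9% ACR30 rate.
from scipy.stats import beta


def prob_rate_at_least(responders: int, n: int, threshold: float = 0.5) -> float:
    """Posterior probability that the true response rate is at least `threshold`."""
    a = 1 + responders             # posterior alpha after observing the data
    b = 1 + (n - responders)       # posterior beta
    return beta.sf(threshold, a, b)  # P(rate > threshold | data)


# Success criterion: >= 80% posterior probability that the ACR30 rate is >= 50%.
p = prob_rate_at_least(responders=72, n=81)  # 72/81 is roughly 88.9%
print(f"P(ACR30 rate >= 0.5 | data) = {p:.4f}",
      "-> success" if p >= 0.80 else "-> not met")
```

Under these assumptions, a response rate near 89% in roughly 80 patients yields a posterior probability of essentially 1 that the true ACR30 rate exceeds 50%, which is consistent with the trial comfortably clearing its 80% probability bar.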
When compared for the distinct diseases, the ACR30 rates were also numerically lower for ixekizumab relative to adalimumab for both ERA (88.9% vs 93.8%) and JPsA (88.9% vs 100%), but all of the adalimumab patients were naive to biologics. In comparison, about 75% of patients receiving ixekizumab were biologic-therapy naive.
Response rates to ixekizumab overall were numerically higher for patients without previous biologic experience than for those with experience (90.0% vs 85.7%), and this was also the case for patients with ERA (92.5% vs 78.6%). However, in the JPsA group, biologic-experienced patients had higher numerical response rates to ixekizumab (100% vs 85.0%).
An ACR30 is not a clinical goal that satisfies most patients and clinicians, Dr. Ramanan conceded, but he noted that ACR50 was reached with ixekizumab by 81.5% of patients with ERA and 74.1% of those with JPsA, and ACR70 was reached by 68.5% and 55.6%, respectively. The more stringent ACR90 (27.8% and 33.3%) and ACR100 (14.8% and 25.9%) responses were achieved by smaller but still substantial proportions of the ERA and JPsA groups, respectively.
Through week 16, 58.0% of those treated with ixekizumab had an adverse event considered treatment-related. Nearly half were of mild severity, and the remainder were moderate. Only 3.7% were considered serious. No patient discontinued study treatment because of an adverse event.
In this study, the presence of at least three active peripheral joints was an inclusion criterion. The median age was about 13 years in the biologic-naive adalimumab and ixekizumab groups and 14 years in the biologic-experienced ixekizumab group. The youngest patient in the study was aged 5 years, and the oldest was aged 18 years. Although about 40% of patients in the two biologic-naive subgroups were women, the proportion was 60% in the biologic-experienced group.
On average, patients in the biologic-naive group were entered about 1 year after diagnosis. In the experienced patients, the average duration of disease at entry was nearly 4 years. About 45% of patients remained on conventional synthetic disease-modifying antirheumatic drugs while receiving ixekizumab. The proportion was 35% in the adalimumab reference arm.
Ixekizumab Might Fulfill Need for More Options
There are several biologics that have received regulatory approval or are already widely used for the treatment of JPsA or ERA, but more options are needed, according to Dr. Ramanan and the chair of the abstract session in which these data were reported. Caroline Ospelt, MD, PhD, a researcher at the Center for Experimental Rheumatology, University Hospital Zurich, Switzerland, said that regulatory approval of ixekizumab will depend on sustained efficacy and safety in longer follow-up from the COSPIRIT-JIA trial, but that this trial supports continued development.
Despite a novel mechanism of action, “the data so far suggest a level of efficacy similar to that of anti-TNF [anti-tumor necrosis factor] biologics,” said Dr. Ospelt, who, in addition to moderating the late-breaking session, served as Scientific Program Chair of EULAR 2024.
While Dr. Ospelt emphasized that she is a researcher involved in translational rheumatology studies and not a clinician, she said there was consensus within the program committee to select this abstract over other high-quality late-breaker submissions on the basis of its potential clinical significance.
Dr. Ramanan reported financial relationships with AbbVie, AstraZeneca, Novartis, Pfizer, Roche, SOBI, UCB, and Eli Lilly, which provided funding for this study. Dr. Ospelt reported no potential conflicts of interest.
A version of this article first appeared on Medscape.com.
FROM EULAR 2024
Tirzepatide Reduces Sleep Interruptions, Halting Almost Half of CPAP Use
ORLANDO, FLA. — The diabetes and weight loss drug tirzepatide (Mounjaro for type 2 diabetes; Zepbound for obesity) was so effective at reducing sleep disruptions in patients with obesity and obstructive sleep apnea (OSA) that 40%-50% no longer needed to use a continuous positive airway pressure (CPAP) device, according to two new studies.
Tirzepatide, a long-acting glucose-dependent insulinotropic polypeptide (GIP) receptor agonist and glucagon-like peptide 1 (GLP-1) receptor agonist, also lowered C-reactive protein levels and systolic blood pressure. And patients taking the medication lost 18%-20% of their body weight, said lead author Atul Malhotra, MD, professor of medicine at the University of California, San Diego, and director of sleep medicine at UC San Diego Health.
The two double-blind, randomized, controlled trials in patients with obesity and moderate to severe OSA were conducted at 60 sites in nine countries. The results were presented at the American Diabetes Association (ADA) 84th Scientific Sessions and simultaneously published online in the New England Journal of Medicine.
OSA affects 1 billion people worldwide and 30 million American adults, many of whom are undiagnosed. Obesity is a common risk factor. According to the ADA, 40% of those with obesity have OSA and 70% of those with OSA have obesity.
CPAP is effective and is the most widely used intervention for OSA, but many patients refuse the device, stop using it, or cannot tolerate it. Should tirzepatide eventually gain Food and Drug Administration approval for OSA, it would be the first drug approved for the condition.
“This new drug treatment offers a more accessible alternative for individuals who cannot tolerate or adhere to existing therapies,” said Dr. Malhotra.
Huge Reduction in Episodes, Severity
For the two studies, patients were enrolled who had moderate to severe OSA, defined as more than 15 events per hour (using the apnea-hypopnea index [AHI]) and a body mass index of 30 kg/m2 or greater. Those not using a CPAP device were enrolled in study 1, and those using a CPAP device were enrolled in study 2.
Participants received either the maximum tolerated dose of tirzepatide (10 or 15 mg by once-weekly injection) or placebo for 1 year. In study 1, 114 individuals received tirzepatide and 120 received placebo. For study 2, 119 patients received tirzepatide and 114 received placebo. All participants received regular lifestyle counseling sessions about nutrition and were instructed to reduce food intake by 500 kcal/day and to engage in at least 150 min/week of physical activity.
Enrollment of men was capped at 70% to ensure adequate representation of women.
At baseline, 65%-70% of participants had severe OSA, with more than 30 events/hour on the AHI scale and a mean of 51.5 events/hour.
By 1 year, patients taking tirzepatide had 27-30 fewer events/hour, compared with 4-6 fewer events/hour for those taking placebo.
Up to half of those who received tirzepatide in the two trials had either fewer than 5 events/hour, or 5-14 events/hour on the AHI with an Epworth Sleepiness Scale score of 10 or less. Those thresholds “represent a level at which CPAP therapy may not be recommended,” wrote the authors.
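The decision logic implied by those thresholds can be written out as a short sketch. The cutoffs are the ones quoted above; the function names and structure are hypothetical simplifications for illustration, not clinical guidance.

```python
# Simplified sketch of the AHI/ESS thresholds described above; cutoffs are those
# quoted in the article, and the function names are hypothetical.
def osa_severity(ahi):
    """Severity bands used for trial entry: moderate >15, severe >30 events/hour."""
    if ahi > 30:
        return "severe"
    if ahi > 15:
        return "moderate"
    return "below trial entry threshold"

def cpap_may_not_be_recommended(ahi, ess_score):
    """True if AHI < 5, or AHI 5-14 with an Epworth Sleepiness Scale score <= 10."""
    return ahi < 5 or (5 <= ahi <= 14 and ess_score <= 10)

print(osa_severity(51.5))                            # baseline mean in the trials -> "severe"
print(cpap_may_not_be_recommended(12, ess_score=9))  # True
```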
Patients in the tirzepatide group also had a decrease in systolic blood pressure from baseline of 9.7 mm Hg in study 1 and 7.6 mm Hg in study 2 at week 48.
The most common adverse events were diarrhea, nausea, and vomiting, which occurred in approximately a quarter of patients taking tirzepatide. There were two adjudication-confirmed cases of acute pancreatitis among those taking tirzepatide in study 2.
Patients who received tirzepatide also reported fewer daytime and nighttime disturbances, as measured using the Patient-Reported Outcomes Measurement Information System Short Form scale for Sleep-Related Impairment and Sleep Disturbance.
Tirzepatide Plus CPAP Is Best
Writing in an accompanying editorial, Sanjay R. Patel, MD, noted that, although clinical guidelines have recommended that weight loss strategies be incorporated as part of OSA treatment, “the integration of obesity management into the approaches to care for obstructive sleep apnea has lagged.”
As many as half of patients abandon CPAP therapy within 3 years, wrote Dr. Patel, who is professor of medicine and epidemiology at the University of Pittsburgh, Pittsburgh, Pennsylvania, and medical director of the UPMC Comprehensive Sleep Disorders program. “An effective medication to treat obesity is thus an obvious avenue to pursue.”
Dr. Patel noted the large reductions in the number of events on the AHI scale. He wrote that the improvement in systolic blood pressure “was substantially larger than effects seen with CPAP therapy alone and indicate that tirzepatide may be an attractive option for those patients who seek to reduce their cardiovascular risk.”
Dr. Patel raised concerns about whether patients outside of a trial would stick with therapy, noting studies have shown high rates of discontinuation of GLP-1 receptor agonists.
And, he wrote, “racial disparities in the use of GLP-1 receptor agonists among patients with diabetes arouse concern that the addition of tirzepatide as a treatment option for obstructive sleep apnea without directly addressing policies relative to coverage of care will only further exacerbate already pervasive disparities in clinical care for obstructive sleep apnea.”
Commenting on the study during the presentation of the results, Louis Aronne, MD, said he believes the trials demonstrate “the treatment of obesity with tirzepatide plus CPAP is really the optimal treatment for obstructive sleep apnea and obesity-related cardiometabolic risks.” Dr. Aronne is the Sanford I. Weill professor of metabolic research at Weill Cornell Medical College, New York City.
Dr. Aronne added there is still much to learn. It is still not clear whether tirzepatide had an independent effect in the OSA trial — as has been seen in other studies where the drug clearly reduced cardiovascular risk — or whether the positive results were primarily caused by weight loss.
“I believe that over time we’ll see that this particular effect in sleep apnea is related to weight,” he said.
The study was supported by Eli Lilly. Dr. Malhotra has reported being a paid consultant for Lilly and ZOLL Medical and a cofounder of Healcisio.
A version of this article appeared on Medscape.com.
FROM ADA 2024
Six Distinct Subtypes of Depression, Anxiety Identified via Brain Imaging
This research has “immediate clinical implications,” study investigator Leanne Williams, PhD, director of the Stanford Medicine Center for Precision Mental Health and Wellness, told this news organization.
“At Stanford, we have started translating the imaging technology into use in a new precision mental health clinic. The technology is being actively developed for wider use in clinical settings, and we hope to make it accessible to more clinicians and patients,” Dr. Williams said.
The study was published online in Nature Medicine.
No More Trial and Error?
Depression is a highly heterogeneous disease, with individual patients having different symptoms and treatment responses. About 30% of patients with major depression are resistant to treatment, and about half of patients with generalized anxiety disorder do not respond to first-line treatment.
“The dominant ‘one-size-fits-all’ diagnostic approach in psychiatry leads to cycling through treatment options by trial and error, which is lengthy, expensive, and frustrating, with 30%-40% of patients not achieving remission after trying one treatment,” the authors noted.
“The goal of our work is figuring out how we can get it right the first time,” Dr. Williams said in a news release, and that requires a better understanding of the neurobiology of depression.
To that end, 801 adults diagnosed with depression and anxiety underwent functional MRI to measure brain activity at rest and when engaged in tasks designed to test cognitive and emotional functioning.
Researchers probed six brain circuits previously associated with depression: the default mode circuit, salience circuit, attention circuit, negative affect circuit, positive affect circuit, and the cognitive control circuit.
Using a machine learning technique known as cluster analysis to group the patients’ brain images, they identified six clinically distinct biotypes of depression and anxiety defined by specific profiles of dysfunction within both task-free and task-evoked brain circuits.
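As a rough illustration of that clustering step, the sketch below groups hypothetical per-patient circuit dysfunction scores into six clusters with a generic k-means algorithm; the placeholder data, feature construction, and choice of algorithm are assumptions and do not reproduce the study's actual pipeline.

```python
# Illustrative sketch only: placeholder data and generic k-means clustering,
# not the study's actual pipeline or features.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
circuit_scores = rng.normal(size=(801, 12))   # hypothetical task-free + task-evoked circuit scores

X = StandardScaler().fit_transform(circuit_scores)           # standardize features
kmeans = KMeans(n_clusters=6, n_init=20, random_state=0).fit(X)
biotype_labels = kmeans.labels_                              # cluster assignment per patient
print(np.bincount(biotype_labels))                           # patients per putative biotype
```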
“Importantly for clinical translation, these biotypes predict response to different pharmacological and behavioral interventions,” investigators wrote.
For example, patients with a biotype characterized by overactivity in cognitive regions of the brain experienced the best response to the antidepressant venlafaxine, compared with patients with other biotypes.
Patients with a different biotype, characterized by higher at-rest levels of activity in three regions associated with depression and problem-solving, responded better to behavioral therapy.
In addition, those with a third biotype, who had lower levels of activity at rest in the brain circuit that controls attention, were less apt to see improvement of their symptoms with behavioral therapy than those with other biotypes. The various biotypes also correlated with differences in symptoms and task performance.
For example, individuals with overactive cognitive regions of the brain had higher levels of anhedonia than those with other biotypes, and they also performed worse on tasks measuring executive function. Those with the biotype that responded best to behavioral therapy also made errors on executive function tasks but performed well on cognitive tasks.
A Work in Progress
The findings provide a deeper understanding of the neurobiological underpinnings of depression and anxiety and could lead to improved diagnostic accuracy and more tailored treatment approaches, the researchers noted.
Naming the biotypes is a work in progress, Dr. Williams said.
“We have thought a lot about the naming. In the Nature Medicine paper, we use a technical convention to name the biotypes based on which brain circuit problems define each of them,” she explained.
“For example, the first biotype is called DC+SC+AC+ because it is defined by connectivity increases [C+] on three resting circuits — default mode [D], salience [S], and frontoparietal attention [A]. We are working with collaborators to generate biotype names that could be convergent across findings and labs. In the near future, we anticipate generating more descriptive medical names that clinicians could refer to alongside the technical names,” Dr. Williams said.
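A small, purely hypothetical helper shows how such technical labels could be assembled from per-circuit connectivity flags, following the letter-plus-"C+" convention Dr. Williams describes; the function and its inputs are illustrative only.

```python
# Hypothetical helper mirroring the naming convention described above:
# a circuit letter (D = default mode, S = salience, A = frontoparietal attention)
# followed by "C+" when resting connectivity is increased.
def biotype_name(circuit_increases):
    """Build a technical label such as 'DC+SC+AC+' from per-circuit flags."""
    return "".join(f"{letter}C+" for letter, increased in circuit_increases.items() if increased)

print(biotype_name({"D": True, "S": True, "A": True}))    # DC+SC+AC+
print(biotype_name({"D": True, "S": False, "A": False}))  # DC+
```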
Commenting on the research for this news organization, James Murrough, MD, PhD, director of the Depression and Anxiety Center for Research and Treatment at the Icahn School of Medicine at Mount Sinai, New York, called it “super exciting.”
“The work from this research group is an excellent example of where precision psychiatry research is right now, particularly with regard to the use of brain imaging to personalize treatment, and this paper gives us a glimpse of where we could be in the not-too-distant future,” Dr. Murrough said.
However, he cautioned that at this point, “we’re far from realizing the dream of precision psychiatry. We just don’t have robust evidence that brain imaging markers can really guide clinical decision-making currently.”
Funding for the study was provided by the National Institutes of Health and by Brain Resource Ltd. Dr. Williams declared US patent applications numbered 10/034,645 and 15/820,338: “Systems and methods for detecting complex networks in MRI data.” Dr. Murrough had no relevant disclosures.
A version of this article appeared on Medscape.com.
Medicare Advantage Plans Not Always Advantageous
While Medicare Advantage (MA) plans are marketed as providing more generous benefits than traditional Medicare (TM), a longitudinal cohort analysis found that differences in financial burden between beneficiaries who switched to MA and those who stayed with TM were minimal.
In fact, according to a study by Sungchul Park, PhD, a health economist at Korea University in Seoul, and colleagues, estimated annual out-of-pocket spending was $168 higher for those switching to MA than for those staying in TM, a 10.5% relative increase over the baseline out-of-pocket spending of $1597 annually among switchers, although the estimate ranged widely, from a $133 decrease to a $469 increase. And for some, MA enrollment was associated with a higher likelihood of catastrophic financial burden.
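The relative increase quoted above follows directly from the reported figures, as the quick check below shows; the variable names are ours.

```python
# Quick arithmetic check of the relative increase quoted above (figures from the article).
baseline_oop = 1597   # baseline annual out-of-pocket spending among switchers, in dollars
extra_oop = 168       # estimated additional annual spending after switching to MA, in dollars
print(f"{extra_oop / baseline_oop:.1%}")  # ~10.5%
```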
“Our findings contrast with the notion that MA’s apparently more generous health insurance benefits lead to financial savings for enrollees,” Dr. Park and associates wrote in Annals of Internal Medicine.
The Study
The analysis looked at costs for 7054 TM stayers and 1544 TM-to-MA switchers from the 2014-2020 Medical Expenditure Panel Survey, focusing on a cohort in which 18% of TM-covered individuals in year 1 switched to MA in year 2.
Comparative financial outcome measures included individual healthcare costs (out-of-pocket spending/cost sharing), financial burden (high/catastrophic), and subjective financial hardship (difficulty paying medical bills).
Although the overall out-of-pocket differences for MA were minimal and amounted to less than 1% of total healthcare expenses, MA was associated with a greater financial burden in vulnerable populations, especially those with low incomes. For every 100 beneficiaries with family incomes below 200% of the federal poverty level, one to six more switchers faced a catastrophic financial burden, with their out-of-pocket costs consuming more than 40% of household income in the year after switching.
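For clarity, the catastrophic-burden definition used above can be expressed as a short sketch. The 40% threshold is the one quoted in the article; the function and the example figures are hypothetical.

```python
# Minimal sketch of the catastrophic-burden definition quoted above;
# the function and example figures are hypothetical.
def catastrophic_burden(out_of_pocket, household_income, threshold=0.40):
    """True if out-of-pocket costs exceed `threshold` of household income."""
    if household_income <= 0:
        return out_of_pocket > 0
    return out_of_pocket / household_income > threshold

print(catastrophic_burden(9_000, 20_000))   # True: 45% of income
print(catastrophic_burden(3_000, 20_000))   # False: 15% of income
```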
The gap between the perception of lower costs and the reality may reflect a substantially heavier cost-sharing burden for certain services in MA plans, Dr. Park and associates pointed out. In some studies, MA enrollees generally paid less for inpatient stays of 3 days than the Part A hospital deductible under TM, but they were more likely to face higher cost sharing for stays exceeding 7 days.
Furthermore, whereas TM covers home health services without cost sharing, some MA plans impose copayments, and out-of-network services can cost more. MA enrollees paid an average of $9 more for mental health services than for other in-network services and often encountered limited access to in-network providers. According to a 2021 study, only 18.2% of mental health professionals, 34.4% of cardiologists, 50.0% of psychiatrists, and 57.9% of primary care providers were included in MA networks.
An accompanying editorial noted that private MA plans will reap $83 billion in overpayments from U.S. taxpayers this year, according to Congress’s Medicare Payment Advisory Commission.
And as the data from Dr. Park and colleagues reveal, switchers don’t get much financial protection, according to primary care physician and healthcare researcher Steffi J. Woolhandler, MD, MPH, and internist David U. Himmelstein, MD, both of City University of New York at Hunter College in New York City.
“Medicare Advantage looks good when you’re healthy and don’t need much care. But when you need coverage, it often fails, leaving you with big bills and narrow choices for care,” Dr. Woolhandler said in an interview.
So how do these findings square with insurers’ hard-sell claims and enrollees’ perceptions that MA cuts out-of-pocket costs? “The likeliest explanation is that MA insurers have structured their benefits to advantage low-cost (that is, profitable) enrollees and disadvantage those requiring expensive care,” the editorial commentators wrote. For beneficiaries on inexpensive medications, MA plans would be a financial win. “But for patients requiring expensive chemotherapies, the 20% coinsurance that most MA plans charge could be financially ruinous.”
Commenting on the study but not involved in it, David A. Lipschutz, JD, LLB, associate director of the Center for Medicare Advocacy in Washington, DC, called the study an important one that provides more evidence that significant overpayments to MA plans don’t translate to better financial protections for plan enrollees, particularly lower-income individuals. “While there has been some recent movement to hold plans more accountable for providing necessary care, much more impactful action by policymakers is required to mitigate the harms of the growing privatization of the Medicare program,” he said. “MA overpayments could be redistributed to traditional Medicare in order to enrich all Medicare beneficiaries instead of just insurance companies.”
This study was supported by the National Research Foundation of Korea. Dr. Park disclosed no competing interests. One study coauthor reported support from government and not-for-profit research-funding bodies. Editorialists Dr. Woolhandler and Dr. Himmelstein had no competing interests to declare. Dr. Lipschutz disclosed Medicare advocacy work.
While Medicare Advantage (MA) plans are marketed as providing more generous benefits than traditional Medicare (TM), differences in the financial burden between beneficiaries switching to MA and staying with TM, are minimal, a longitudinal cohort analysis found.
In fact, according to a study by Sungchul Park, PhD, a health economist at Korea University in Seoul, and colleagues, the estimated annual out-of-pocket spending when switching to MA was $168 higher than staying in TM. That amounted to a 10.5% relative increase based on baseline out-of-pocket spending of $1597 annually among switchers, ranging widely, however, from a $133 decrease to a $469 increase. And for some, MA enrollment was associated with a higher likelihood of catastrophic financial burden.
“Our findings contrast with the notion that MA’s apparently more generous health insurance benefits lead to financial savings for enrollees,” Dr. Park and associates wrote in Annals of Internal Medicine.
The study
The analysis looked at costs for 7054 TM stayers and 1544 TM-to-MA switchers from the 2014-2020 Medical Expenditure Panel Survey, focusing on a cohort in which 18% of TM-covered individuals in year 1 switched to MA in year 2.
Comparative financial outcome measures included individual healthcare costs (out-of-pocket spending/cost sharing), financial burden (high/catastrophic), and subjective financial hardship (difficulty paying medical bills).
Although the overall out-of-pocket differences for MA were minimal and amounted to less than 1% of total healthcare expenses, MA was associated with a greater financial burden in vulnerable, especially in low-income populations. For every 100 beneficiaries with family incomes below 200% of the federal poverty level, one to six more switchers faced a catastrophic financial burden, with their out-of-pocket costs consuming more than 40% of household income in the year after switching.
The gap between the perception of lower costs and reality may be caused by a substantially heavier cost-sharing burden for certain services in MA plans, Dr. Park and associates pointed out. While MA enrollees generally paid less in some studies than the Part A hospital deductible for TM for inpatient stays of 3 days, they were more likely to face higher cost sharing for stays exceeding 7 days
Furthermore, whereas TM covers home health services without cost sharing, some MA plans have copayments. In addition, out-of-network health services can cost more. MA enrollees paid an average of $9 more for mental health services than for other in-network services and often encountered limited access to in-network providers. According to a 2021 study, only 18.2% of mental health professionals, 34.4% of cardiologists, 50.0% of psychiatrists, and 57.9% of primary care providers were included in MA networks,
An accompanying editorial noted that private MA plans will reap $83 billion in overpayments from U.S. taxpayers this year, according to Congress’s Medicare Payment Advisory Commission.
And as the data from Dr. Park and colleagues reveal, switchers don’t get much financial protection, according to primary care physician and healthcare researcher Steffi J. Woolhandler, MD, MPH, and internist David U. Himmelstein, MD, both of City University of New York at Hunter College in New York City.
“Medicare Advantage looks good when you’re healthy and don’t need much care. But when you need coverage, it often fails, leaving you with big bills and narrow choices for care,” Dr. Woolhandler said in an interview.
So how do these findings square with insurers’ hard-sell claims and enrollees’ perceptions that MA cuts out-of-pocket costs? “The likeliest explanation is that MA insurers have structured their benefits to advantage low-cost (that is, profitable) enrollees and disadvantage those requiring expensive care,” the editorial commentators wrote. For beneficiaries on inexpensive medications, MA plans would be a financial win. “But for patients requiring expensive chemotherapies, the 20% coinsurance that most MA plans charge could be financially ruinous.”
Commenting on the study but not involved in it, David A. Lipschutz, JD, LLB, associate director of the Center for Medicare Advocacy in Washington, DC, called the study an important one that provides more evidence that significant overpayments to MA plans don’t translate to better financial protections for plan enrollees, particularly lower-income individuals. “While there has been some recent movement to hold plans more accountable for providing necessary care, much more impactful action by policymakers is required to mitigate the harms of the growing privatization of the Medicare program,” he said. “MA overpayments could be redistributed to traditional Medicare in order to enrich all Medicare beneficiaries instead of just insurance companies.”
This study was supported by the National Research Foundation of Korea. Dr. Park disclosed no competing interests. One study coauthor reported support from government and not-for-profit research-funding bodies. Editorialists Dr. Woolhandler and Dr. Himmelstein had no competing interests to declare. Dr. Lipschutz disclosed Medicare advocacy work.
FROM ANNALS OF INTERNAL MEDICINE
Are Primary Care Physicians the Answer to the US Headache Neurologist Shortage?
SAN DIEGO —
It is estimated that about 4 million primary care physician (PCP) office visits annually are headache related and that 52.8% of all migraine encounters occur in primary care settings.
However, PCPs aren’t always adequately trained in headache management, and referral times to specialist care can be lengthy.
Data published in Headache show only 564 accredited headache specialists practice in the United States, but at least 3700 headache specialists are needed to treat those affected by migraine, with even more needed to address other disabling headache types such as tension-type headache and cluster headache. To keep up with population growth, it is estimated that the United States will require 4500 headache specialists by 2040.
First Contact
To tackle this specialist shortfall, the American Headache Society (AHS) developed the First Contact program with the aim of improving headache education in primary care and helping alleviate at least some of the demand for specialist care.
The national program was rolled out in 2020 and 2021. The educational symposia were delivered to PCPs at multiple locations across the country. The initiative also included a comprehensive website with numerous support resources.
After participating in the initiative, attendees were surveyed about the value of the program, and the results were subsequently analyzed and presented at the AHS annual meeting.
The analysis included 636 survey respondents, a 38% response rate. Almost all participants (96%) were MDs and DOs. The remainder included nurse practitioners, physician assistants, and dentists.
Overall, 85.6% of respondents reported being completely or very confident in their ability to recognize and accurately diagnose headache disorders, and 81.3% said they were completely or very confident in their ability to create tailored treatment plans.
Just over 90% of participants reported they would implement practice changes as a result of the program. The most commonly cited change was the use of diagnostic tools such as the three-question Migraine ID screener, followed closely by consideration of prescribing triptans and reducing the use of unnecessary neuroimaging.
“Overall, there was a positive response to this type of educational programming and interest in ongoing education in addressing headache disorders with both pharmaceutical and non-pharmaceutical treatment options,” said Nisha Malhotra, MD, a resident at New York University (NYU) Langone Health, New York City, who presented the findings at the conference.
The fact that so many general practitioners were keen to use this easy-to-use screen [Migraine ID screener], which can pick up about 90% of people with migraine, is “great,” said study investigator Mia Minen, MD, associate professor and chief of headache research at NYU Langone Health. “I’m pleased primary care providers said they were considering implementing this simple tool.”
However, respondents also cited barriers to change. These included cost constraints (48.9%), insurance reimbursement issues (48.6%), and lack of time (45.3%). Dr. Malhotra noted these concerns are primarily related to workflow rather than knowledge gaps or lack of training.
“This is exciting in that there doesn’t seem to be an issue with education primarily but rather with the logistical issues that exist in the workflow in a primary care setting,” said Dr. Malhotra.
Participants also noted the need for other improvements. For example, they expressed interest in differentiating migraine from other headache types and having a better understanding of how and when to refer to specialists, said Dr. Malhotra.
These practitioners also want to know more about treatment options beyond first-line medications. “They were interested in understanding more advanced medication treatment options beyond just the typical triptan,” said Dr. Malhotra.
In addition, they want to become more skilled in non-pharmaceutical options such as occipital nerve blocks and in massage, acupuncture, and other complementary forms of migraine management, she said.
The study may be vulnerable to sampling bias as survey participants had just attended an educational symposium on headaches. “They were already, to some degree, interested in improving their knowledge on headache,” said Dr. Malhotra.
Another study limitation was that researchers didn’t conduct a pre-survey analysis to establish a baseline against which to measure changes resulting from the symposia. In addition, because the survey was conducted so soon after the symposium, “it’s difficult to draw conclusions on the long-term effects,” she added.
“That being said, First Contact is one of the first national initiatives for primary care education, and thus far, it has been very well received.”
The next step is to continue expanding the program and to create a First Contact for women and First Contact for pediatrics, said Dr. Minen.
Improved Diagnosis, Better Care
Commenting on the initiative, Juliana VanderPluym, MD, a headache specialist at the Mayo Clinic, Phoenix, who co-chaired the session where the survey results were presented, said it helps address the supply-demand imbalance in headache healthcare.
“Many, many people have headache disorders, and very few people are technically headache specialists, so we have to rely on our colleagues in primary care to help address the great need that’s out there for patients with headache disorders.”
Too many patients don’t get a proper diagnosis or appropriate treatment, said Dr. VanderPluym, so as time passes, “diseases can become more chronic and more refractory, and it affects people’s quality of life and productivity.”
The First Contact program, she said, helps increase providers’ comfort and confidence that they are providing the best patient care possible and could lead to a reduction in the need for specialist referrals.
Dr. Minen serves on the First Contact advisory board.
A version of this article appeared on Medscape.com.
FROM AHS 2024