Molecular mechanisms may predict major depressive disorder
“Given the multifaceted nature of MDD, the multiple small but dynamic genetic alterations in biomolecular pathways, which are modulated by epigenetic modifications, could contribute to a better understanding of the underlying aetiology and pathophysiology of this disorder,” wrote Cyrus Su Hui Ho, MD, of National University Health System, Singapore, and colleagues. However, studies of biomarkers in psychiatry are limited, and the predictive potential of microribonucleic acids (miRNAs) has not been examined, they said.
In a study published in Comprehensive Psychiatry, the researchers identified 60 adults with depression and 60 healthy controls. Depression severity was assessed with the Hamilton Depression Rating Scale. Demographic and clinical characteristics were similar between the patients and controls; 10 patients were unmedicated.
The researchers used QIAGEN Ingenuity Pathway Analysis to identify the specific depression-related biological pathways affected by various miRNAs.
A total of six miRNAs (miR-542-3p, miR-181b-3p, miR-190a-5p, miR-33a-3p, miR-3690, and miR-6895-3p) were down-regulated in unmedicated depressed patients, compared with healthy controls.
In a receiver operating characteristic (ROC) analysis, a combination panel with three miRNAs (miR-542-3p, miR-181b-3p, and miR-3690) in whole blood yielded an area under the curve (AUC) of 0.67. This combination correctly classified 66.7% of MDD patients and 63.3% of healthy controls.
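For readers unfamiliar with how a multi-marker panel like this is evaluated, the sketch below illustrates the general idea: combine the expression levels of the three miRNAs into a single score with a simple classifier and measure the area under the ROC curve. It is a minimal illustration on made-up data, assuming a scikit-learn logistic-regression approach; it is not the authors' analysis pipeline, and the variable names and simulated values are hypothetical.

```python
# Minimal sketch: evaluating a hypothetical 3-miRNA panel with ROC analysis.
# Synthetic data only -- this is not the study's dataset or its analysis code.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(0)
n = 60  # 60 patients and 60 controls, mirroring the study's group sizes

# Hypothetical normalized expression of miR-542-3p, miR-181b-3p, and miR-3690;
# "patients" (label 1) are simulated with slightly lower expression (down-regulation).
controls = rng.normal(loc=0.0, scale=1.0, size=(n, 3))
patients = rng.normal(loc=-0.4, scale=1.0, size=(n, 3))
X = np.vstack([controls, patients])
y = np.array([0] * n + [1] * n)

# Combine the three markers into one panel score with logistic regression.
model = LogisticRegression().fit(X, y)
scores = model.predict_proba(X)[:, 1]

# The area under the ROC curve quantifies how well the panel separates the
# groups: 0.5 is chance, 1.0 is perfect (the study's panel reached 0.67).
print("AUC:", round(roc_auc_score(y, scores), 2))

# Sensitivity and specificity at a chosen cutoff give the kind of
# "correctly classified" percentages reported in the study.
fpr, tpr, thresholds = roc_curve(y, scores)
```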
The ability of individual miRNAs to differentiate between MDD patients and controls in the current study was limited, the researchers wrote in their discussion. “However, when three miRNAs (miR-542b-3p, miR-181b-3p, and miR-3690) were combined as a panel, the AUC was enhanced to an almost acceptable degree (AUC of 0.67, approaching 0.7) and might have value in complementing clinical diagnoses,” they said.
The study findings were limited by several factors, including the small sample size and the use of medications by most MDD patients, which resulted in an especially small number of unmedicated patients, the researchers noted. Other limitations included the use of a study population from a single center and the inability to explain the link between blood and brain miRNA expression, they said.
However, the study is the first in Singapore to examine the role of miRNAs in depression and to identify miRNAs as potential biomarkers for MDD, they said.
Additional studies are needed to explore miRNA biomarkers for diagnosis, disease prognosis, and treatment response in MDD, they concluded.
The study was supported by the National University Health System Seed Fund. The researchers had no financial conflicts to disclose.
FROM COMPREHENSIVE PSYCHIATRY
First-line or BiV backup? Conduction system pacing for CRT in heart failure
Pacing as a device therapy for heart failure (HF) is headed for what is probably its next big advance.
After decades of biventricular (BiV) pacemaker success in resynchronizing the ventricles and improving clinical outcomes, relatively new conduction-system pacing (CSP) techniques that avoid the pitfalls of right-ventricular (RV) pacing using BiV lead systems have been supplanting traditional cardiac resynchronization therapy (CRT) in selected patients at some major centers. In fact, they are solidly ensconced in a new guideline document addressing indications for CSP and BiV pacing in HF.
But in that document, CSP figures mostly as a backup, an alternative when BiV pacing isn’t appropriate or can’t be engaged.
That’s mainly because the limited, mostly observational evidence supporting CSP in the document can’t measure up to the clinical experience and plethora of large, randomized trials behind BiV-CRT.
But that shortfall is headed for change. Several new comparative studies, including a small randomized trial, have added significantly to evidence suggesting that CSP is at least as effective as traditional CRT for procedural, functional, safety, and clinical outcomes.
The new studies “are inherently prone to bias, but their results are really good,” observed Juan C. Diaz, MD. They show improvements in left ventricular ejection fraction (LVEF) and symptoms with CSP that are “outstanding compared to what we have been doing for the last 20 years,” he said in an interview.
Dr. Diaz, Clínica Las Vegas, Medellin, Colombia, is an investigator with the observational SYNCHRONY study, which is among the new CSP studies formally presented at the annual scientific sessions of the Heart Rhythm Society. He is also lead author on its same-day publication in JACC: Clinical Electrophysiology.
Dr. Diaz said that CSP, which sustains pacing via the native conduction system, makes more “physiologic sense” than BiV pacing and represents “a step forward” for HF device therapy.
SYNCHRONY compared left bundle branch (LBB)–area pacing with BiV pacing as the initial strategy for achieving cardiac resynchronization in patients with ischemic or nonischemic cardiomyopathy.
CSP is “a long way” from replacing conventional CRT, he said. But the new studies at the HRS sessions should help extend His-bundle and LBB-area pacing to more patients, he added, given the significant long-term “drawbacks” of BiV pacing. These include inevitable RV pacing, multiple leads, and the risks associated with chronic transvenous leads.
Zachary Goldberger, MD, University of Wisconsin–Madison, went a bit further in support of CSP as invited discussant for the SYNCHRONY presentation.
Given that it improved LVEF, heart failure class, HF hospitalizations (HFH), and mortality in that study and others, Dr. Goldberger said, CSP could potentially “become the dominant mode of resynchronization going forward.”
Other experts at the meeting saw CSP’s potential more as one of several pacing techniques that could be brought to bear for patients with CRT indications.
“Conduction system pacing is going to be a huge complement to biventricular pacing,” to which about 30% of patients have a “less than optimal response,” said Pugazhendhi Vijayaraman, MD, chief of clinical electrophysiology, Geisinger Heart Institute, Danville, Pa.
“I don’t think it needs to replace biventricular pacing, because biventricular pacing is a well-established, incredibly powerful therapy,” he told this news organization. But CSP is likely to provide “a good alternative option” in patients with poor responses to BiV-CRT.
It may, however, render some current BiV-pacing alternatives “obsolete,” Dr. Vijayaraman observed. “At our center, at least for the last 5 years, no patient has needed epicardial surgical left ventricular lead placement” because CSP was a better backup option.
Dr. Vijayaraman presented two of the meeting’s CSP vs. BiV pacing comparisons. In one, the 100-patient randomized HOT-CRT trial, contractile function improved significantly on CSP, which could be either His-bundle or LBB-area pacing.
He also presented an observational study of LBB-area pacing at 15 centers in Asia, Europe, and North America and is lead author of its simultaneous publication in the Journal of the American College of Cardiology.
“I think left-bundle conduction system pacing is the future, for sure,” Jagmeet P. Singh, MD, DPhil, told this news organization. Still, it doesn’t always work and when it does, it “doesn’t work equally in all patients,” he said.
“Conduction system pacing certainly makes a lot of sense,” especially in patients with left-bundle-branch block (LBBB), and “maybe not as a primary approach but certainly as a secondary approach,” said Dr. Singh, Massachusetts General Hospital, Boston, who is not a coauthor on any of the three studies.
He acknowledged that CSP may work well as a first-line option in patients with LBBB at some experienced centers. For patients without LBBB or with an intraventricular conduction delay, who represent 45%-50% of current CRT cases, Dr. Singh observed, “there’s still more evidence” that BiV-CRT is a more appropriate initial approach.
Standard CRT may fail, however, even in some patients who otherwise meet guideline-based indications. “We don’t really understand all the mechanisms for nonresponse in conventional biventricular pacing,” observed Niraj Varma, MD, PhD, Cleveland Clinic, also not involved with any of the three studies.
In some groups, including “patients with larger ventricles,” for example, BiV-CRT doesn’t always narrow the electrocardiographic QRS complex or preexcite delayed left ventricular (LV) activation, hallmarks of successful CRT, he said in an interview.
“I think we need to understand why this occurs in both situations,” but in such cases, CSP alone or as an adjunct to direct LV pacing may be successful. “Sometimes we need both an LV lead and the conduction-system pacing lead.”
Narrower, more efficient use of CSP as a BiV-CRT alternative may also boost its chances for success, Dr. Varma added. “I think we need to refine patient selection.”
HOT-CRT: Randomized CSP vs. BiV pacing trial
Conducted at three centers in a single health system, the His-optimized cardiac resynchronization therapy study (HOT-CRT) randomly assigned 100 patients with primary or secondary CRT indications either to CSP – by either His-bundle or LBB-area pacing – or to standard BiV-CRT as the first-line resynchronization method.
Treatment crossovers, allowed for either pacing modality in the event of implantation failure, occurred in two patients and nine patients initially assigned to CSP and BiV pacing, respectively (4% vs. 18%), Dr. Vijayaraman reported.
Historically in trials, BiV pacing has elevated LVEF by about 7%, he said. The mean 12-point increase observed with CSP “is huge, in that sense.”

HOT-CRT enrolled a predominantly male and White population at centers highly experienced in both CSP and BiV pacing, limiting its broad relevance to practice, as pointed out by both Dr. Vijayaraman and his presentation’s invited discussant, Yong-Mei Cha, MD, Mayo Clinic, Rochester, Minn.

Dr. Cha, who is director of cardiac device services at her center, also highlighted the greater rate of crossover from BiV pacing to CSP, 18% vs. 4% in the other direction. “This is a very encouraging result,” she said, because the implant-failure rate for LBB-area pacing may drop once more operators become “familiar and skilled with conduction-system pacing.” Overall, she said, the study supports CSP as “a very good alternative for heart failure patients when BiV pacing fails.”
International comparison of CSP and BiV pacing
In Dr. Vijayaraman’s other study, the observational comparison of LBB-area pacing and BiV-CRT, the CSP technique emerged as a “reasonable alternative to biventricular pacing, not only for improvement in LV function but also to reduce adverse clinical outcomes.”
Indeed, in the international study of 1,778 mostly male patients with primary or secondary CRT indications who received LBB-area or BiV pacing (797 and 981 patients, respectively), those on CSP saw a significant drop in risk for the primary endpoint, death or HFH.
Mean LVEF improved from 27% to 41% in the LBB-area pacing group and 27% to 37% with BiV pacing (P < .001 for both changes) over a follow-up averaging 33 months. The difference in improvement between CSP and BiV pacing was significant at P < .001.
In adjusted analysis, the risk for death or HFH was greater for BiV-pacing patients, a difference driven by HFH events.
- Death or HFH: hazard ratio, 1.49 (95% confidence interval, 1.21-1.84; P < .001).
- Death: HR, 1.14 (95% CI, 0.88-1.48; P = .313).
- HFH: HR, 1.49 (95% CI, 1.16-1.92; P = .002).
The analysis has all the “inherent biases” of an observational study. The risk for patient-selection bias, however, was somewhat mitigated by consistent practice patterns at participating centers, Dr. Vijayaraman told this news organization.
For example, he said, operators at six of the institutions were most likely to use CSP as the first-line approach, and the same number of centers usually went with BiV pacing.
SYNCHRONY: First-line LBB-area pacing vs. BiV-CRT
The two approaches were compared in the prospective, international, observational study of 371 patients with ischemic or nonischemic cardiomyopathy and standard CRT indications. Allocation of 128 patients to LBB-area pacing and 243 to BiV-CRT was based on patient and operator preferences, reported Jorge Romero Jr, MD, Brigham and Women’s Hospital, Boston, at the HRS sessions.
Risk for the death-HFH primary endpoint dropped 38% for those initially treated with LBB-area pacing, compared with BiV pacing, primarily because of a lower HFH risk:
- Death or HFH: HR, 0.62 (95% CI, 0.41-0.93; P = .02).
- Death: HR, 0.57 (95% CI, 0.25-1.32; P = .19).
- HFH: HR, 0.61 (95% CI, 0.34-0.93; P = .02).
Patients in the CSP group were also more likely to improve by at least one NYHA (New York Heart Association) class (80.4% vs. 67.9%; P < .001), consistent with their greater absolute change in LVEF (8.0 vs. 3.9 points; P < .01).
The findings “suggest that LBBAP [left-bundle branch area pacing] is an excellent alternative to BiV pacing,” with a comparable safety profile, write Jayanthi N. Koneru, MBBS, and Kenneth A. Ellenbogen, MD, in an editorial accompanying the published SYNCHRONY report.
“The differences in improvement of LVEF are encouraging for both groups,” but were superior for LBB-area pacing, continue Dr. Koneru and Dr. Ellenbogen, both with Virginia Commonwealth University Medical Center, Richmond. “Whether these results would have regressed to the mean over a longer period of follow-up or diverge further with LBB-area pacing continuing to be superior is unknown.”
Years for an answer?
A large randomized comparison of CSP and BiV-CRT, called Left vs. Left, is currently in early stages, Sana M. Al-Khatib, MD, MHS, Duke University Medical Center, Durham, N.C., said in a media presentation on two of the presented studies. It has a planned enrollment of more than 2,100 patients on optimal meds with an LVEF of 50% or lower and either a QRS duration of at least 130 ms or an anticipated burden of RV pacing exceeding 40%.
The trial, she said, “will take years to give an answer, but it is actually designed to address the question of whether a composite endpoint of time to death or heart failure hospitalization can be improved with conduction system pacing vs. biventricular pacing.”
Dr. Al-Khatib is a coauthor on the new guideline covering both CSP and BiV-CRT in HF, as are Dr. Cha, Dr. Varma, Dr. Singh, Dr. Vijayaraman, and Dr. Goldberger; Dr. Ellenbogen is one of the reviewers.
Dr. Diaz discloses receiving honoraria or fees for speaking or teaching from Bayer Healthcare, Pfizer, AstraZeneca, Boston Scientific, and Medtronic. Dr. Vijayaraman discloses receiving honoraria or fees for speaking, teaching, or consulting for Abbott, Medtronic, Biotronik, and Boston Scientific; and receiving research grants from Medtronic. Dr. Varma discloses receiving honoraria or fees for speaking or consulting as an independent contractor for Medtronic, Boston Scientific, Biotronik, Impulse Dynamics USA, Cardiologs, Abbott, Pacemate, Implicity, and EP Solutions. Dr. Singh discloses receiving fees for consulting from EBR Systems, Merit Medical Systems, New Century Health, Biotronik, Abbott, Medtronic, MicroPort Scientific, Cardiologs, Sanofi, CVRx, Impulse Dynamics USA, Octagos, Implicity, Orchestra Biomed, Rhythm Management Group, and Biosense Webster; and receiving honoraria or fees for speaking and teaching from Medscape. Dr. Cha had no relevant financial relationships. Dr. Romero discloses receiving research grants from Biosense Webster; and speaking or receiving honoraria or fees for consulting, speaking, or teaching, or serving on a board for Sanofi, Boston Scientific, and AtriCure. Dr. Koneru discloses consulting for Medtronic and receiving honoraria from Abbott. Dr. Ellenbogen discloses consulting or lecturing for or receiving honoraria from Medtronic, Boston Scientific, and Abbott. Dr. Goldberger discloses receiving royalty income from and serving as an independent contractor for Elsevier. Dr. Al-Khatib discloses receiving research grants from Medtronic and Boston Scientific.
A version of this article first appeared on Medscape.com.
FROM HEART RHYTHM 2023
Dapagliflozin matches non–loop diuretic for congestion in AHF: DAPA-RESIST
Dapagliflozin relieves congestion in acute heart failure (AHF) about as well as metolazone, suggests a new randomized trial. The drugs were given to the study’s loop diuretic–resistant patients on top of furosemide.
Changes in volume status, measures of pulmonary congestion, and risk for serious adverse events were similar for those assigned to take dapagliflozin, an SGLT2 inhibitor, or metolazone, a quinazoline diuretic. Those on dapagliflozin ultimately received a larger cumulative furosemide dose in the 61-patient trial, called DAPA-RESIST.
“The next steps are to assess whether a strategy of using SGLT2 inhibitors up front in patients with HF reduces the incidence of diuretic resistance, and to test further combinations of diuretics such as thiazide or thiazide-like diuretics, compared with acetazolamide, when used in addition to an IV loop diuretic and SGLT2 inhibitors together,” Ross T. Campbell, MBChB, PhD, University of Glasgow and Queen Elizabeth University Hospital, also in Glasgow, said in an interview.
Dr. Campbell presented the findings at the annual meeting of the Heart Failure Association of the European Society of Cardiology and is senior author on its simultaneous publication in the European Heart Journal.
The multicenter trial randomly assigned 61 patients with AHF to receive dapagliflozin at a fixed dose of 10 mg once daily or metolazone 5 mg or 10 mg (starting dosage at physician discretion) once daily for 3 days of treatment on an open-label basis.
Patients had entered the trial on furosemide at a mean daily dosage of 260 mg in the dapagliflozin group and 229 mg for those assigned metolazone; dosages for the loop diuretic in the trial weren’t prespecified.
Their median age was 79 years, and 54% were women; 44% had HF with reduced ejection fraction. The glomerular filtration rate was below 30 mL/min per 1.73 m2 in 26% of patients; 90% had chronic kidney disease, 98% had peripheral edema, and 46% had diabetes.
The mean cumulative furosemide dose at 96 hours after the start of randomized therapy was significantly higher among the dapagliflozin group’s 31 patients: 976 mg versus 704 mg for the 30 on metolazone (P < .05). However, patients on dapagliflozin experienced a lesser increase in creatinine (P < .05) and in blood urea (P < .01), a greater change in serum sodium (P < .05), and a smaller reduction in serum potassium (P < .01).
Although the trial wasn’t powered for those outcomes, Dr. Campbell said, “less biochemical upset could be associated with better outcomes in terms of less medium- to long-term renal impairment, and in the short-term length of stay.”
The mean decrease in weight at 96 hours, the primary endpoint, reached 3 kg on dapagliflozin, compared with 3.6 kg with metolazone (P = .082), a difference that fell short of significance.
Loop diuretic efficiency, that is, weight change in kilograms per 40 mg of furosemide, “was smaller with dapagliflozin than with metolazone at each time point after randomization, although the difference was only significant at 24 hours,” the published report states.
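As a rough illustration of that metric, the short sketch below computes it from the group-level figures reported above (a 3-kg weight loss on a 976-mg cumulative furosemide dose versus 3.6 kg on 704 mg). This is a back-of-envelope calculation on reported means, assuming the simple ratio definition given in the report; the trial itself assessed efficiency per patient at each time point, and the function name here is made up for illustration.

```python
# Back-of-envelope illustration of "loop diuretic efficiency":
# weight change (kg) per 40 mg of furosemide.
# Uses only the group means reported in the article; the trial computed this
# per patient and per time point, so these figures are illustrative only.

def loop_diuretic_efficiency(weight_loss_kg: float, cumulative_furosemide_mg: float) -> float:
    """Weight change in kilograms per 40-mg increment of furosemide."""
    return weight_loss_kg / (cumulative_furosemide_mg / 40.0)

dapagliflozin = loop_diuretic_efficiency(weight_loss_kg=3.0, cumulative_furosemide_mg=976)
metolazone = loop_diuretic_efficiency(weight_loss_kg=3.6, cumulative_furosemide_mg=704)

print(f"Dapagliflozin arm: ~{dapagliflozin:.2f} kg per 40 mg furosemide")  # ~0.12
print(f"Metolazone arm:    ~{metolazone:.2f} kg per 40 mg furosemide")  # ~0.20
```

Consistent with the report, the illustrative value is smaller in the dapagliflozin arm, largely because that group accumulated a higher furosemide dose.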
Changes in pulmonary congestion (by lung ultrasound) and fluid volume were similar between the groups.
“This trial further adds to the evidence base and safety profile for using SGLT2 inhibitors in patients with acute heart failure,” and “gives further confidence to clinicians that this class can be started in ‘sicker’ patients with HF who also have diuretic resistance,” Dr. Campbell said.
Asked during his presentation’s question-and-answer session whether dapagliflozin might have shown a greater effect had the dosage been higher, Dr. Campbell explained that the drug was investigational when the trial started. Adding a higher-dose dapagliflozin arm, he said, would have made for an excessively complex study. But “that’s a great research question for another trial.”
DAPA-RESIST was funded by AstraZeneca. Dr. Campbell disclosed receiving honoraria from AstraZeneca for speaking and from Bayer for serving on an advisory board.
A version of this article first appeared on Medscape.com.
suggests a new randomized trial. The drugs were given to the study’s loop diuretic–resistant patients on top of furosemide.
Changes in volume status and measures of pulmonary congestion and risk for serious adverse events were similar for those assigned to take dapagliflozin, an SGLT2 inhibitor, or metolazone, a quinazoline diuretic. Those on dapagliflozin zone ultimately received a larger cumulative furosemide dose in the 61-patient trial, called DAPA-RESIST.
“The next steps are to assess whether a strategy of using SGLT2 inhibitors up front in patients with HF reduces the incidence of diuretic resistance, and to test further combinations of diuretics such as thiazide or thiazide-like diuretics, compared with acetazolamide, when used in addition to an IV loop diuretic and SGLT2 inhibitors together,” Ross T. Campbell, MBChB, PhD, University of Glasgow and Queen Elizabeth University Hospital, also in Glasgow, said in an interview.
Dr. Campbell presented the findings at the annual meeting of the Heart Failure Association of the European Society of Cardiology and is senior author on its simultaneous publication in the European Heart Journal.
The multicenter trial randomly assigned 61 patients with AHF to receive dapagliflozin at a fixed dose of 10 mg once daily or metolazone 5 mg or 10 mg (starting dosage at physician discretion) once daily for 3 days of treatment on an open-label basis.
Patients had entered the trial on furosemide at a mean daily dosage of 260 mg in the dapagliflozin group and 229 mg for those assigned metolazone; dosages for the loop diuretic in the trial weren’t prespecified.
Their median age was 79 and 54% were women; 44% had HF with reduced ejection fraction. Their mean glomerular filtration rate was below 30 mL/min per 1.73 m2 in 26%, 90% had chronic kidney disease, 98% had peripheral edema, and 46% had diabetes.
The mean cumulative furosemide dose was significantly higher among the dapagliflozin group’s 31 patients, 976 mg versus 704 mg for the 30 on acetazolamide (P < .05), 96 hours after the start of randomized therapy. However, patients on dapagliflozin experienced a lesser increase in creatinine (P < .05) and in blood urea (P < .01), a greater change in serum sodium (P < .05), and a smaller reduction in serum potassium (P < .01).
Although the trial wasn’t powered for those outcomes, Dr. Campbell said, “less biochemical upset could be associated with better outcomes in terms of less medium- to long-term renal impairment, and in the short-term length of stay.”
The mean decrease in weight at 96 hours, the primary endpoint, reached 3 kg on dapagliflozin, compared with 3.6 kg with metolazone (P = .082), a difference that fell short of significance.
Loop diuretic efficiency, that is weight change in kg per 40 mg furosemide, “was smaller with dapagliflozin than with metolazone at each time point after randomization, although the difference was only significant at 24 hours,” the published report states.
Changes in pulmonary congestion (by lung ultrasound) and fluid volume were similar between the groups.
“This trial further adds to the evidence base and safety profile for using SGLT2 inhibitors in patients with acute heart failure,” and “gives further confidence to clinicians that this class can be started in ‘sicker’ patients with HF who also have diuretic resistance,” Dr. Campbell said.
Asked during his presentation’s question and answer whether dapagliflozin might have shown a greater effect had the dosage been higher, Dr. Campbell explained that the drug was investigational when the trial started. Adding a higher-dose dapagliflozin arm, he said, would have made for an excessively complex study. But “that’s a great research question for another trial.”
DAPA-RESIST was funded by AstraZeneca. Dr. Campbell disclosed receiving honoraria from AstraZeneca for speaking and from Bayer for serving on an advisory board.
A version of this article first appeared on Medscape.com.
suggests a new randomized trial. The drugs were given to the study’s loop diuretic–resistant patients on top of furosemide.
Changes in volume status and measures of pulmonary congestion and risk for serious adverse events were similar for those assigned to take dapagliflozin, an SGLT2 inhibitor, or metolazone, a quinazoline diuretic. Those on dapagliflozin zone ultimately received a larger cumulative furosemide dose in the 61-patient trial, called DAPA-RESIST.
“The next steps are to assess whether a strategy of using SGLT2 inhibitors up front in patients with HF reduces the incidence of diuretic resistance, and to test further combinations of diuretics such as thiazide or thiazide-like diuretics, compared with acetazolamide, when used in addition to an IV loop diuretic and SGLT2 inhibitors together,” Ross T. Campbell, MBChB, PhD, University of Glasgow and Queen Elizabeth University Hospital, also in Glasgow, said in an interview.
Dr. Campbell presented the findings at the annual meeting of the Heart Failure Association of the European Society of Cardiology and is senior author on its simultaneous publication in the European Heart Journal.
The multicenter trial randomly assigned 61 patients with AHF to receive dapagliflozin at a fixed dose of 10 mg once daily or metolazone 5 mg or 10 mg (starting dosage at physician discretion) once daily for 3 days of treatment on an open-label basis.
Patients had entered the trial on furosemide at a mean daily dosage of 260 mg in the dapagliflozin group and 229 mg for those assigned metolazone; dosages for the loop diuretic in the trial weren’t prespecified.
Their median age was 79 years, and 54% were women; 44% had HF with reduced ejection fraction. Glomerular filtration rate was below 30 mL/min per 1.73 m² in 26% of patients, 90% had chronic kidney disease, 98% had peripheral edema, and 46% had diabetes.
At 96 hours after the start of randomized therapy, the mean cumulative furosemide dose was significantly higher in the dapagliflozin group’s 31 patients than in the 30 patients on metolazone (976 mg vs. 704 mg; P < .05). However, patients on dapagliflozin experienced a lesser increase in creatinine (P < .05) and in blood urea (P < .01), a greater change in serum sodium (P < .05), and a smaller reduction in serum potassium (P < .01).
Although the trial wasn’t powered for those outcomes, Dr. Campbell said, “less biochemical upset could be associated with better outcomes in terms of less medium- to long-term renal impairment, and in the short-term length of stay.”
The mean decrease in weight at 96 hours, the primary endpoint, reached 3 kg on dapagliflozin, compared with 3.6 kg with metolazone (P = .082), a difference that fell short of significance.
Loop diuretic efficiency, that is, weight change in kilograms per 40 mg of furosemide, “was smaller with dapagliflozin than with metolazone at each time point after randomization, although the difference was only significant at 24 hours,” the published report states.
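For readers who want to see how that efficiency metric works, the back-of-envelope calculation below reuses the reported 96-hour group means (3 kg and 3.6 kg weight loss; 976 mg and 704 mg cumulative furosemide). It is only an illustration, not the trial’s own per-time-point analysis.

```latex
\text{Efficiency} = \frac{\Delta\,\text{weight (kg)}}{\text{cumulative furosemide (mg)}/40}
\qquad
\text{Dapagliflozin: } \frac{3.0}{976/40} \approx 0.12\ \text{kg per 40 mg},
\qquad
\text{Metolazone: } \frac{3.6}{704/40} \approx 0.20\ \text{kg per 40 mg}.
```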
Changes in pulmonary congestion (by lung ultrasound) and fluid volume were similar between the groups.
“This trial further adds to the evidence base and safety profile for using SGLT2 inhibitors in patients with acute heart failure,” and “gives further confidence to clinicians that this class can be started in ‘sicker’ patients with HF who also have diuretic resistance,” Dr. Campbell said.
Asked during his presentation’s question-and-answer session whether dapagliflozin might have shown a greater effect had the dosage been higher, Dr. Campbell explained that the drug was investigational when the trial started. Adding a higher-dose dapagliflozin arm, he said, would have made for an excessively complex study. But “that’s a great research question for another trial.”
DAPA-RESIST was funded by AstraZeneca. Dr. Campbell disclosed receiving honoraria from AstraZeneca for speaking and from Bayer for serving on an advisory board.
A version of this article first appeared on Medscape.com.
FROM HFA-ESC 2023
Gout linked to smaller brain volume, higher likelihood of neurodegenerative diseases
Patients with gout may have smaller brain volumes and higher brain iron markers than people without gout, and also be more likely to develop Parkinson’s disease, probable essential tremor, and dementia, researchers in the United Kingdom report.
“We were surprised about the regions of the brain affected by gout, several of which are important for motor function. The other intriguing finding was that the risk of dementia amongst gout patients was strongly time-dependent: highest in the first 3 years after their gout diagnosis,” lead study author Anya Topiwala, BMBCh, DPhil, said in an interview.
“Our combination of traditional and genetic approaches increases the confidence that gout is causing the brain findings,” said Dr. Topiwala, a clinical research fellow and consultant psychiatrist in the Nuffield Department of Population Health at the University of Oxford, England.
“We suggest that clinicians be vigilant for cognitive and motor problems after gout diagnosis, particularly in the early stages,” she added.
Links between gout and neurodegenerative diseases debated in earlier studies
Gout, the most common inflammatory arthritis, affects around 1%-4% of people, the authors wrote, with monosodium urate crystal deposits causing acute flares of pain and swelling in joints and periarticular tissues.
Whether and how gout may affect the brain has been debated in the literature. Gout and hyperuricemia have been linked with elevated stroke risk, and although observational studies have linked hyperuricemia with lower dementia risk, especially Alzheimer’s disease, Mendelian randomization studies of Alzheimer’s disease have had conflicting results.
A novel approach that analyzes brain structure and genetics
In a study published in Nature Communications, Dr. Topiwala and her colleagues combined observational and Mendelian randomization techniques to explore relationships between gout and neurodegenerative diseases. They analyzed data from over 303,000 volunteer participants between 40 and 69 years of age recruited between 2006 and 2010 to contribute their detailed genetic and health information to the U.K. Biobank, a large-scale biomedical database and research resource.
Patients with gout tended to be older and male. At baseline, all participants’ serum urate levels were measured, and 30.8% of patients with gout reported that they currently used urate-lowering therapy.
MRI shows brain changes in patients with gout
In what the authors said is the first investigation of neuroimaging markers in patients with gout, they compared gray matter volumes between the 1,165 participants with gout and the 32,202 controls without gout who had MRI data.
They found no marked sex differences in associations. Urate was inversely linked with global brain volume and with gray and white matter volumes, and gout appeared to age global gray matter by 2 years.
Patients with gout and higher urate showed significant differences in regional gray matter volumes, especially in the cerebellum, pons, and midbrain, as well as subcortical differences in the nucleus accumbens, putamen, and caudate. They also showed significant differences in white matter tract microstructure in the fornix.
Patients with gout were more likely to develop dementia (average hazard ratio [HR] over the study period = 1.60), especially in the first 3 years after gout diagnosis (HR = 7.40). The association was stronger for vascular dementia (average HR = 2.41) than for all-cause dementia, and it was not significant for Alzheimer’s disease (average HR = 1.62).
In asymptomatic participants though, urate and dementia were inversely linked (HR = 0.85), with no time dependence.
Gout was linked with higher incidence of Parkinson’s disease (HR = 1.43) and probable essential tremor (HR = 6.75). In asymptomatic participants, urate and Parkinson’s disease (HR = 0.89), but not probable essential tremor, were inversely linked.
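Hazard ratios of this kind are typically estimated with Cox proportional hazards regression. The sketch below shows a minimal version of such a fit; the data frame, file name, column names, and covariates are illustrative assumptions, not the study’s actual analysis code.

```python
# Minimal sketch: estimating an adjusted hazard ratio for incident dementia in gout vs. no gout.
# Assumes a pandas DataFrame with one row per participant and hypothetical columns:
#   followup_years (time to dementia or censoring), dementia (1 = event, 0 = censored),
#   gout (1/0), age, male (1/0). This is not the authors' pipeline.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("cohort.csv")  # hypothetical input file

cph = CoxPHFitter()
cph.fit(
    df[["followup_years", "dementia", "gout", "age", "male"]],
    duration_col="followup_years",
    event_col="dementia",
)
cph.print_summary()  # exp(coef) for `gout` is the adjusted hazard ratio

# A time-dependent effect, such as higher risk in the first 3 years after diagnosis,
# can be probed by splitting follow-up at 3 years and fitting each interval separately.
```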
Genetic analyses reinforce MRI results
Using Mendelian randomization estimates, the authors found that genetic links generally reflected their observational findings. Both genetically predicted gout and serum urate were significantly linked with regional gray matter volumes, including cerebellar, midbrain, pons, and brainstem.
They also found significant links with higher magnetic susceptibility in the putamen and caudate, markers of higher iron. But while genetically predicted gout was significantly linked with global gray matter volume, urate was not.
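As an illustration of the Mendelian randomization logic rather than the authors’ pipeline, a basic inverse-variance-weighted (IVW) estimate combines per-variant Wald ratios from genetic summary statistics; every variable name and number below is made up.

```python
# Minimal sketch of an inverse-variance-weighted (IVW) Mendelian randomization estimate.
# beta_exp: per-SNP effect on the exposure (e.g., serum urate)
# beta_out: per-SNP effect on the outcome (e.g., a regional gray matter volume)
# se_out:   standard error of the outcome effect. All values are hypothetical.
import numpy as np

beta_exp = np.array([0.10, 0.08, 0.12, 0.05])
beta_out = np.array([-0.020, -0.015, -0.025, -0.008])
se_out   = np.array([0.006, 0.005, 0.007, 0.004])

wald = beta_out / beta_exp            # per-variant causal-effect estimates
weights = (beta_exp / se_out) ** 2    # inverse variance of each Wald ratio
ivw = np.sum(wald * weights) / np.sum(weights)
ivw_se = 1.0 / np.sqrt(np.sum(weights))

print(f"IVW estimate: {ivw:.3f} (SE {ivw_se:.3f})")
```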
In males, but not in females, urate was positively linked with alcohol intake and lower socioeconomic status.
Dr. Topiwala acknowledged several limitations to the study, writing that “the results from the volunteer participants may not apply to other populations; the cross-sectional serum urate measurements may not reflect chronic exposure; and Parkinson’s disease and essential tremor may have been diagnostically confounded.”
A novel approach that suggests further related research
Asked to comment on the study, Puja Khanna, MD, MPH, a rheumatologist and clinical associate professor of medicine at the University of Michigan, Ann Arbor, called its novel use of neuroimaging interesting.
Dr. Khanna, who was not involved in the study, said she would like to know more about the role that horizontal pleiotropy – one genetic variant having independent effects on multiple traits – plays in this disease process, and about the impact of the antioxidative properties of urate in maintaining neuroprotection.
“[The] U.K. Biobank is an excellent database to look at questions of association,” John D. FitzGerald, MD, PhD, MPH, MBA, professor and clinical chief of rheumatology at the University of California, Los Angeles, said in an interview.
“This is a fairly rigorous study,” added Dr. FitzGerald, also not involved in the study. “While it has lots of strengths,” including its large sample size and Mendelian randomization, it also has “abundant weaknesses,” he added. “It is largely cross-sectional, with single urate measurement and single brain MRI.”
“Causation is the big question,” Dr. FitzGerald noted. “Does treating gout (or urate) help prevent dementia or neurodegenerative decline?”
Early diagnosis benefits patients
Dr. Khanna and Dr. FitzGerald joined the authors in advising doctors to monitor their gout patients for cognitive and motor symptoms of neurodegenerative disease.
“It is clearly important to pay close attention to the neurologic exam and history in gout, especially because it is a disease of the aging population,” Dr. Khanna advised. “Addressing dementia when gout is diagnosed can lead to prompt mitigation strategies that can hugely impact patients.”
Dr. Topiwala and her colleagues would like to investigate why the dementia risk was time-dependent. “Is this because of the acute inflammatory response in gout, or could it just be that patients with gout visit their doctors more frequently, so any cognitive problems are picked up sooner?” she asked.
The authors, and Dr. Khanna and Dr. FitzGerald, report no relevant financial relationships. The Wellcome Trust; the U.K. Medical Research Council; the European Commission Horizon 2020 research and innovation program; the British Heart Foundation; the U.S. National Institutes of Health; the Engineering and Physical Sciences Research Council; and the National Institute for Health and Care Research funded the study.
FROM NATURE COMMUNICATIONS
ER+/HER2– breast cancer: Is first or second line CDK4/6 inhibitor therapy better?
Reserving CDK4/6 inhibitors for second-line use in ER+/HER2– advanced breast cancer produced progression-free survival (PFS) after two lines of therapy similar to that with first-line use, with fewer severe adverse events and lower costs. That was the conclusion of the phase 3 SONIA study, which was presented at the annual meeting of the American Society of Clinical Oncology.
“The benefit from first line therapy is not maintained and almost completely disappears when patients in the control arm cross over to receive CDK4/6 inhibition in second line,” said Gabe Sonke, MD, PhD, during his presentation at the meeting.
CDK4/6 inhibitors have shown benefit in both the first- and second-line settings, according to Dr. Sonke, who is a medical oncologist at the Netherlands Cancer Institute, Amsterdam. He added that most guidelines suggest use of CDK4/6 inhibitors in the first line, but there hasn’t been a direct comparison between use in the first and second line.
“Many patients do very well on endocrine therapy alone [in the first line]. Combination treatment leads to a higher risk of the emergence of resistant patterns such as ESR1 mutations, and CDK4/6 inhibitors also come with added costs and toxicities. Given the absence of comparative data between first line and second line, we designed the SONIA trial,” said Dr. Sonke.
Study methods and results
The researchers recruited 1,050 pre- and postmenopausal women who were randomized to a nonsteroidal aromatase inhibitor (AI) in the first line followed by a second-line CDK4/6 inhibitor (CDK4/6i) plus the estrogen receptor antagonist fulvestrant, or to a nonsteroidal AI plus a CDK4/6i in the first line and fulvestrant in the second line. The most commonly used CDK4/6i was palbociclib at 91%, followed by ribociclib at 8% and abemaciclib at 1%.
After a median follow-up of 37.3 months, the median duration of CDK4/6i exposure was 24.6 months in the first-line CDK4/6i group and 8.1 months in the second-line CDK4/6i group.
The median PFS during first-line therapy was 24.7 months in the first-line CDK4/6i group and 16.1 months in the second-line CDK4/6i group (hazard ratio, 0.59; P < .0001), which was consistent with the results seen in CDK4/6i pivotal trials in the first-line setting, according to Dr. Sonke. However, PFS after two lines of therapy was not significantly different between the groups (31.0 months vs. 26.8 months, respectively; HR, 0.87; P = .10).
The safety profile was similar to what had been seen in previous trials with respect to adverse events like bone marrow and liver function abnormalities and fatigue, but there were 42% more grade 3 or higher adverse events in the first-line CDK4/6i group than in the second-line CDK4/6i group. Dr. Sonke estimated that the increase in costs related to adverse events amounted to about $200,000 per patient receiving CDK4/6i as first line.
There were no significant differences between the two groups in quality of life measurement.
Subgroup analyses of patient categories including prior adjuvant or neoadjuvant chemotherapy or endocrine therapy, de novo metastatic disease, visceral disease, bone-only disease, and treatment with palbociclib or ribociclib showed no difference in outcome for first- versus second-line CDK4/6i treatment.
Are CDK4/6i costs and side effects worth it?
The findings challenge the need for using CDK4/6 inhibitors as first-line treatment in this population, according to Dr. Sonke, who also raised the following related questions.
“If you were a patient, would you consider a treatment that offers no improvement in quality of life and does not improve overall survival? As a doctor or nurse, would you recommend such a treatment to your patient that nearly doubles the incidence of side effects? And if you were responsible for covering the costs of this treatment, whether as an individual or health care insurance, would you consider it worth $200,000?”
“For many patients, particularly in the first line setting where resistance mechanisms are less prevalent, endocrine therapy alone remains an excellent option,” said Dr. Sonke during his presentation.
During the discussion portion of the session, Daniel Stover, MD, who is an associate professor of translational therapeutics at Ohio State University Comprehensive Cancer Center, Columbus, pointed out that the lack of differences in the subanalyses leaves little guidance for physicians.
“We really have a limited signal on who can delay CDK4/6 inhibitors. I think one of the most important outcomes of this study is the focus on the patient, as there were substantially fewer adverse events and of course we need to think about financial toxicity as well,” he said. “I think one of the things that is perhaps most exciting to think about is who are the very good risk patients who can delay CDK4/6 inhibitor [therapy]? I think for the majority of patients, endocrine therapy plus CDK4/6 inhibitor is still the appropriate treatment, but I would argue we need additional biomarkers, be it RNA-based biomarkers, novel PET imaging, or perhaps [circulating tumor] DNA dynamics.”
Do cost savings and reduced side effects outweigh first-line PFS benefit?
During the question-and-answer session, William Sikov, MD, spoke up from the audience in support of Dr. Sonke’s conclusions.
“Clearly there are still patients who benefit from that approach, but I think that we have reached an inflection point: I posit that the question has now changed. [We should not ask] why a certain patient should not receive a CDK4/6 inhibitor, but why a certain patient should receive a CDK4/6 inhibitor in the first-line setting,” said Dr. Sikov, who is professor of medicine at Brown University, Providence, R.I.
Dr. Sonke agreed that first-line CDK4/6i is appropriate for some patients, and later echoed the need for biomarkers, but he said that researchers have so far had little luck in identifying any.
“Of course, it’s a shared decision-making between the patient and a doctor, but I think the baseline would be for all of us to consider first line single-agent endocrine therapy,” he said.
Session comoderator Michael Danso, MD, praised the trial but questioned whether the strategy would be adopted in places like the United States, where cost savings is not a major emphasis.
“Progression-free survival is so significant in the first line setting that I can’t imagine that many oncologists in the U.S. will adopt this approach. The other thing is that this was [almost] all palbociclib, so the question remains, would having a different cyclin dependent kinase inhibitor result in the same results? I think the jury’s still out,” said Dr. Danso, who is the research director at Virginia Oncology Associates, Norfolk.
The study was funded by the Dutch government and Dutch Health Insurers. Dr. Sonke has consulted for or advised Biovica, Novartis, and Seagen. He has received research support through his institution from Agendia, AstraZeneca/Merck, Merck Sharp & Dohme, Novartis, Roche, and Seagen. Dr. Sikov has been a speaker for Lilly. Dr. Danso has received honoraria from Amgen and has consulted or advised Immunomedics, Novartis, Pfizer, and Seagen.
AT ASCO 2023
How has cannabis legalization affected pregnant mothers?
A population-based study shows that the rate of cannabis-related acute care use during pregnancy increased from 11 per 100,000 pregnancies before legalization to 20 per 100,000 pregnancies afterward: an increase of 82%. Absolute increases were small, however.
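The reported relative increase follows directly from those two rates, and it corresponds to a small absolute change.

```latex
\frac{20 - 11}{11} \approx 0.82 \;\Rightarrow\; \text{an increase of about } 82\%,
\text{ or roughly } 9 \text{ additional cases per } 100{,}000 \text{ pregnancies.}
```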
“Our findings are consistent with studies highlighting that cannabis use during pregnancy has been increasing in North America, and this study suggests that cannabis legalization might contribute to and accelerate such trends,” study author Daniel Myran, MD, MPH, a public health and preventive medicine physician at the University of Ottawa in Ontario, said in an interview.
The study was published online in the Canadian Medical Association Journal.
Risks for newborns
In a 2019 study, 7% of U.S. women reported using cannabis during pregnancy in 2016-2017, double the 3.4% rate reported for 2002-2003.
Dr. Myran and colleagues hypothesized that legalizing nonmedical cannabis has affected the drug’s use during pregnancy in Ontario. “We also hypothesized that hospital care for cannabis use would be associated with adverse neonatal outcomes, even after adjusting for other important risk factors that may differ between people with and without cannabis use,” he said.
The researchers’ repeated cross-sectional analysis evaluated changes in the number of pregnant people who received acute care from January 2015 to July 2021 among all patients who were eligible for Ontario’s public health coverage. The final study cohort included 691,242 pregnant patients, of whom 533 had at least one pregnancy with cannabis-related acute care visits. These mothers had a mean age of 24 years vs. 30 for their counterparts with no such visits.
Using segmented regression, the researchers compared changes in the quarterly rate of pregnant people with acute care related to cannabis use (the primary outcome) with those of acute care for mental health conditions or for noncannabis substance use (the control conditions).
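A bare-bones version of such a segmented (interrupted time series) regression is sketched below. The quarterly data frame, file name, and column names are hypothetical; the study’s actual model specification is not shown in this article.

```python
# Minimal sketch of a segmented (interrupted time series) regression on quarterly rates.
# Assumes a pandas DataFrame with hypothetical columns:
#   rate    - pregnancies with cannabis-related acute care per 100,000, per quarter
#   t       - quarter index (0, 1, 2, ...)
#   post    - 1 for quarters after legalization, else 0
#   t_post  - quarters elapsed since legalization (0 before legalization)
import pandas as pd
import statsmodels.formula.api as smf

quarters = pd.read_csv("quarterly_rates.csv")  # hypothetical input

model = smf.ols("rate ~ t + post + t_post", data=quarters).fit()
print(model.summary())
# `post` captures an immediate level change at legalization;
# `t_post` captures a change in slope afterward.
```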
“Severe morning sickness was a major risk factor for care in the emergency department or hospital for cannabis use,” said Dr. Myran. “Prior work has found that people who use cannabis during pregnancy often state that it was used to manage challenging symptoms of pregnancy such as morning sickness.”
Most acute care events (72.2%) were emergency department visits. The most common reasons for acute care were harmful cannabis use (57.6%), followed by cannabis dependence or withdrawal (21.5%), and acute cannabis intoxication (12.8%).
Compared with pregnancies without acute care, those with acute care related to cannabis had higher rates of adverse neonatal outcomes such as birth before 37 weeks’ gestational age (16.9% vs. 7.2%), birth weight at or below the bottom fifth percentile after adjustment for gestational age (12.1% vs. 4.4%), and neonatal intensive care unit admission in the first 28 days of life (31.5% vs. 13%).
An adjusted analysis found that patients younger than 35 years and those living in rural settings or the lowest-income neighborhoods had higher odds of acute cannabis-related care during pregnancy. Patients who received acute care for any substance use or schizophrenia before pregnancy or who accessed outpatient mental health services before pregnancy had higher risk for cannabis-related acute care during pregnancy. Mothers receiving acute care for cannabis also had higher risk for acute care for hyperemesis gravidarum during pregnancy (30.9%).
The rate of acute care for other types of substance use such as alcohol and opioids did not change after cannabis legalization, and acute care for mental health conditions such as anxiety and depression during pregnancy declined by 14%, Dr. Myran noted.
“Physicians who care for pregnant people should consider increasing screening for cannabis use during pregnancy,” said Dr. Myran. “In addition, repeated nonstigmatizing screening and counseling may be indicated for higher-risk groups identified in the study, including pregnancies with severe morning sickness.”
The U.S. perspective
Commenting on the study, M. Camille Hoffman, MD, MSc, a maternal-fetal medicine specialist at the University of Colorado in Aurora, said that the findings likely indicate that legalization has made cannabis users less reluctant to come forward for urgent care. “They cannot really claim that this is equivalent to more use, just that more people are willing to present,” she said. Dr. Hoffman was not involved in the study.
The Canadian results do not align perfectly with what is seen in the United States. “It does suggest that there may be more cannabinoid hyperemesis being coded as hyperemesis gravidarum, which is a pregnancy-specific condition vs. a cannabis-dependence-related one,” said Dr. Hoffman.
Literature in the United States often includes tobacco use as a covariate, she added. “This study does not appear to do that,” she said. “Rather, it uses any substance use. Because of this, it is difficult to really know the contribution of cannabis to the adverse pregnancy outcomes vs. the combination of tobacco and cannabis.”
Finally, she pointed out that 22% of patients with cannabis-related acute care visits during pregnancy had presented for acute care for substance use in the 2 years before conception, compared with 1% of those without such visits. “This suggests to me that this was a highly vulnerable group before the legalization of cannabis as well. The overall absolute difference is nine in total per 100,000 – hardly enough to draw any real conclusions. Again, maybe those nine were simply more willing to come forth with concerns with cannabis being legal.”
There is no known safe level of cannabis consumption, and its use by pregnant women has been linked to later neurodevelopmental issues in their offspring. A 2022 U.S. study suggested that cannabis exposure in the womb may leave children later in life at risk for autism, psychiatric disorders, and problematic substance abuse, particularly as they enter peak periods of vulnerability in late adolescence.
As to the impact of legalization in certain U.S. states, a 2022 study found that women perceived legalization to mean greater access to cannabis, increased acceptance of use, and greater trust in cannabis retailers. In line with Dr. Hoffman’s view, this study suggested that legalization made pregnant women more willing to discuss cannabis use during pregnancy honestly with their care providers.
In the United States, prenatal cannabis use is still included in definitions of child abuse or neglect and can lead to termination of parental rights, even in states with full legalization.
“These findings highlight the need for ongoing monitoring of markers of cannabis use during pregnancy after legalization,” said Dr. Myran. He also called for effective policies in regions with legal cannabis, such as increased warning labels on cannabis products.
This study was supported by the Canadian Institutes of Health Research and the University of Ottawa site of ICES, which is funded by an annual grant from the Ontario Ministry of Health and Ministry of Long-Term Care. Dr. Myran reports a speaker fee from McMaster University. Dr. Hoffman reports no relevant financial relationships.
A version of this article first appeared on Medscape.com.
A population-based study shows that the rate of cannabis-related acute care use during pregnancy increased from 11 per 100,000 pregnancies before legalization to 20 per 100,000 pregnancies afterward: an increase of 82%. Absolute increases were small, however.
“Our findings are consistent with studies highlighting that cannabis use during pregnancy has been increasing in North America, and this study suggests that cannabis legalization might contribute to and accelerate such trends,” study author Daniel Myran, MD, MPH, a public health and preventive medicine physician at the University of Ottawa in Ontario, said in an interview.
The study was published online in the Canadian Medical Association Journal.
Risks for newborns
In a 2019 study, 7% of U.S. women reported using cannabis during pregnancy during 2016-2017, which was double the rate of 3.4% for 2002-2003.
Dr. Myran and colleagues hypothesized that legalizing nonmedical cannabis has affected the drug’s use during pregnancy in Ontario. “We also hypothesized that hospital care for cannabis use would be associated with adverse neonatal outcomes, even after adjusting for other important risk factors that may differ between people with and without cannabis use,” he said.
The researchers’ repeated cross-sectional analysis evaluated changes in the number of pregnant people who received acute care from January 2015 to July 2021 among all patients who were eligible for Ontario’s public health coverage. The final study cohort included 691,242 pregnant patients, of whom 533 had at least one pregnancy with cannabis-related acute care visits. These mothers had a mean age of 24 years vs. 30 for their counterparts with no such visits.
Using segmented regression, the researchers compared changes in the quarterly rate of pregnant people with acute care related to cannabis use (the primary outcome) with those of acute care for mental health conditions or for noncannabis substance use (the control conditions).
“Severe morning sickness was a major risk factor for care in the emergency department or hospital for cannabis use,” said Dr. Myran. “Prior work has found that people who use cannabis during pregnancy often state that it was used to manage challenging symptoms of pregnancy such as morning sickness.”
Most acute care events (72.2%) were emergency department visits. The most common reasons for acute care were harmful cannabis use (57.6%), followed by cannabis dependence or withdrawal (21.5%), and acute cannabis intoxication (12.8%).
Compared with pregnancies without acute care, those with acute care related to cannabis had higher rates of adverse neonatal outcomes such as birth before 37 weeks’ gestational age (16.9% vs. 7.2%), birth weight at or below the bottom fifth percentile after adjustment for gestational age (12.1% vs. 4.4%), and neonatal intensive care unit admission in the first 28 days of life (31.5% vs. 13%).
An adjusted analysis found that patients younger than 35 years and those living in rural settings or the lowest-income neighborhoods had higher odds of acute cannabis-related care during pregnancy. Patients who received acute care for any substance use or schizophrenia before pregnancy or who accessed outpatient mental health services before pregnancy had higher risk for cannabis-related acute care during pregnancy. Mothers receiving acute care for cannabis also had higher risk for acute care for hyperemesis gravidarum during pregnancy (30.9%).
The rate of acute care for other types of substance use such as alcohol and opioids did not change after cannabis legalization, and acute care for mental health conditions such as anxiety and depression during pregnancy declined by 14%, Dr. Myran noted.
“Physicians who care for pregnant people should consider increasing screening for cannabis use during pregnancy,” said Dr. Myran. “In addition, repeated nonstigmatizing screening and counseling may be indicated for higher-risk groups identified in the study, including pregnancies with severe morning sickness.”
The U.S. perspective
Commenting on the study, M. Camille Hoffman, MD, MSc, a maternal-fetal medicine specialist at the University of Colorado in Aurora, said that the findings likely indicate that legalization has made cannabis users less reluctant to come forward for urgent care. “They cannot really claim that this is equivalent to more use, just that more people are willing to present,” she said. Dr. Hoffman was not involved in the study.
The Canadian results do not align perfectly with what is seen in the United States. “It does suggest that there may be more cannabinoid hyperemesis being coded as hyperemesis gravidarum, which is a pregnancy-specific condition vs. a cannabis-dependence-related one,” said Dr. Hoffman.
Literature in the United States often includes tobacco use as a covariate, she added. “This study does not appear to do that,” she said. “Rather, it uses any substance use. Because of this, it is difficult to really know the contribution of cannabis to the adverse pregnancy outcomes vs. the combination of tobacco and cannabis.”
Finally, she pointed out that 22% of patients with cannabis-related acute care visits during pregnancy had received acute care for substance use in the 2 years before conception, vs. 1% of those without such visits. “This suggests to me that this was a highly vulnerable group before the legalization of cannabis as well. The overall absolute difference is nine in total per 100,000 – hardly enough to draw any real conclusions. Again, maybe those nine were simply more willing to come forth with concerns with cannabis being legal.”
There is no known safe level of cannabis consumption, and its use by pregnant women has been linked to later neurodevelopmental issues in their offspring. A 2022 U.S. study suggested that cannabis exposure in the womb may leave children later in life at risk for autism, psychiatric disorders, and problematic substance abuse, particularly as they enter peak periods of vulnerability in late adolescence.
As to the impact of legalization in certain U.S. states, a 2022 study found that women perceived legalization to mean greater access to cannabis, increased acceptance of use, and greater trust in cannabis retailers. In line with Dr. Hoffman’s view, this study suggested that legalization made pregnant women more willing to discuss cannabis use during pregnancy honestly with their care providers.
In the United States, prenatal cannabis use is still included in definitions of child abuse or neglect and can lead to termination of parental rights, even in states with full legalization.
“These findings highlight the need for ongoing monitoring of markers of cannabis use during pregnancy after legalization,” said Dr. Myran. He also called for effective policies in regions with legal cannabis, such as increased warning labels on cannabis products.
This study was supported by the Canadian Institutes of Health Research and the University of Ottawa site of ICES, which is funded by an annual grant from the Ontario Ministry of Health and Ministry of Long-Term Care. Dr. Myran reports a speaker fee from McMaster University. Dr. Hoffman reports no relevant financial relationships.
A version of this article first appeared on Medscape.com.
FROM CMAJ
Cell activity in psoriasis may predict disease severity and provide clues to comorbidities
The activity and clustering of certain cell types may distinguish mild and severe forms of psoriasis, with severe disease altering the cellular and metabolic composition of distal unaffected skin sites, according to a new analysis using single-cell transcriptomic technology.
On the surface, psoriasis severity is identified based on the visible lesions, Rochelle L. Castillo, MD, of the division of rheumatology and the NYU Psoriatic Arthritis Center, NYU Langone Health, New York, and colleagues wrote in their study, published in Science Immunology. Although cellular and molecular features of inflammatory skin diseases such as psoriasis have been characterized, activity at the tissue level and its systemic impact has not been explored.
“Our initial goal was to find measurable molecular signals that could tell us who is more likely to develop severe psoriasis, as well as who is at higher risk of developing related disorders that often accompany psoriasis, such as arthritis and cardiovascular disease,” study co–senior investigator Jose Scher, MD, director of the Psoriatic Arthritis Center and the Judith and Stewart Colton Center for Autoimmunity at NYU Langone Health, said in a press release accompanying the publication of the findings. “Having found signals with potential systemic consequences, we are now working to understand how skin inflammation can lead to widespread disease affecting other organs.”
In the study, the researchers used spatial transcriptomics, a technique that positions tissue sections onto genetic arrays to determine gene expression by cell type and histological location, helping to create a broad image-based map of where certain cell types are located in tissues and with which other cells they are communicating. They characterized the cell activity of skin samples from 11 men and women with mild to severe psoriasis/psoriatic arthritis and three healthy adults who did not have psoriasis. They defined the cellular composition of 25 healthy skin biopsies and matched skin biopsies from psoriatic lesional and nonlesional skin, and identified 17 distinct clusters of cells, which they grouped into epidermal, dermal, pilosebaceous, and adipose categories.
The researchers found that cell activity associated with inflammation, as shown by clusters of fibroblasts and dermal macrophages, was more common in the upper layers of the skin in samples from patients with more severe psoriasis, compared with healthy control samples.
They also examined patterns of immune activity at the cellular level and found significant patterns around the upper follicle, around the perifollicular dermis, and within the hair follicle, where immune cells were enriched in healthy skin. Other cells enriched in these upper layer areas in healthy skin included dendritic cells, innate lymphoid cells, T helper cells, T cytotoxic cells, and myeloid cells.
Clusters of fibroblasts and macrophages, which are associated with inflammation, were clustered in psoriatic lesional skin, which also showed more inflammation at the dermal and suprabasal epidermal levels. B lymphocytes also were more prevalent in lesional skin.
The researchers then analyzed the skin samples according to disease severity; mild psoriasis was defined as a Psoriasis Area and Severity Index (PASI) score of less than 12, and moderate to severe disease as a PASI score of 12 or higher. The macrophage, fibroblast, and lymphatic endothelium–associated clusters distinguished mild and moderate to severe endotypes.
The pathology of moderate to severe psoriasis in lesional and nonlesional skin showed the extensive effects of psoriasis-related inflammation. Although nonlesional mild disease was clustered with healthy skin, in cases of moderate to severe disease, nonlesional and lesional groups were clustered together. This effect was segregated according to disease severity, independent of the presence of joint disease, and “was particularly evident in distal, nonlesional samples,” the researchers wrote.
The researchers also found evidence of increased gene activity in more than three dozen molecular pathways associated with metabolism and lipid levels in areas of lesional and nonlesional skin, Dr. Scher said.
The findings were limited by several factors including the small sample size and the limits of spatial transcriptomics technology resolution, the researchers wrote. “As this technology evolves, platforms with higher density, and by extension, resolution, of spatially barcoded beads will provide more granularity about cellular microenvironments in healthy and diseased states.”
The study was supported by the National Institutes of Health, the National Psoriasis Foundation, the NYU Colton Center for Autoimmunity, the Group for Research and Assessment of Psoriasis and Psoriatic Arthritis, the Beatrice Snyder Foundation, The Riley Family Foundation, the Rheumatology Research Foundation, and the NY Stem Cell Foundation. Dr. Castillo had no financial conflicts to disclose. Dr. Scher has served as a consultant for Janssen, Abbvie, Novartis, Pfizer, Sanofi, UCB, and Bristol-Myers Squibb, and has received research funding from Janssen and Pfizer.
FROM SCIENCE IMMUNOLOGY
Menopause and long COVID: What women should know
British researchers have noted that women at midlife who have long COVID seem to get specific, and severe, symptoms, including brain fog, fatigue, new-onset dizziness, and difficulty sleeping through the night.
Doctors also think it’s possible that long COVID worsens the symptoms of perimenopause and menopause. Lower levels of estrogen and testosterone appear to be the reason.
“A long COVID theory is that there is a temporary disruption to physiological ovarian steroid hormone production, which could [worsen] symptoms of perimenopause and menopause,” said JoAnn V. Pinkerton, MD, professor of obstetrics at the University of Virginia, Charlottesville, and executive director of the North American Menopause Society.
Long COVID symptoms and menopause symptoms can also be very hard to tell apart.
Another U.K. study cautions that because of this kind of symptom overlap, women at midlife may be misdiagnosed. Research from the North American Menopause Society shows that many women may have trouble recovering from long COVID unless their hormone deficiency is treated.
What are the symptoms of long COVID?
More than 200 symptoms have been associated with long COVID, according to the American Medical Association. Common symptoms include extreme fatigue, feeling depleted after exertion, cognitive issues such as brain fog, a heart rate of more than 100 beats per minute, and loss of the sense of smell and taste.
Long COVID symptoms begin a few weeks to a few months after a COVID infection. They can last an indefinite amount of time, but “the hope is that long COVID will not be lifelong,” said Clare Flannery, MD, an endocrinologist and associate professor in the departments of obstetrics, gynecology and reproductive sciences and internal medicine at Yale University, New Haven, Conn.
What are the symptoms of menopause?
Some symptoms of menopause include vaginal infections, irregular bleeding, urinary problems, and sexual problems.
Women at midlife may also have symptoms that overlap with those of perimenopause and menopause.
“Common symptoms of perimenopause and menopause which may also be symptoms ascribed to long COVID include hot flashes, night sweats, disrupted sleep, low mood, depression or anxiety, decreased concentration, memory problems, joint and muscle pains, and headaches,” Dr. Pinkerton said.
Can long COVID actually bring on menopause?
In short: Possibly.
A new study from the Massachusetts Institute of Technology/Patient-Led Research Collaborative/University of California, San Francisco, found that long COVID can cause disruptions to a woman’s menstrual cycle, ovaries, fertility, and menopause itself.
Chronic inflammation related to long COVID may also affect hormones. This kind of inflammatory response could explain irregularities in a woman’s menstrual cycle, according to the Newson Health Research and Education study. For instance, “when the body has inflammation, ovulation can happen,” Dr. Flannery said.
The mechanism for how long COVID could spur menopause can also involve a woman’s ovaries.
“Since the theory is that COVID affects the ovary with declines in ovarian reserve and ovarian function, it makes sense that long COVID could bring on symptoms of perimenopause or menopause more acutely or more severely and lengthen the symptoms of the perimenopause and menopausal transition,” Dr. Pinkerton said.
How can hormone replacement therapy benefit women dealing with long COVID during menopause?
Estradiol, the most potent estrogen in a woman’s body, has already been shown to have a positive effect against COVID.
“Estradiol therapy treats symptoms more aggressively in the setting of long COVID,” said Dr. Flannery.
Estradiol is also a form of hormone therapy for menopause symptoms.
“Estradiol has been shown to help hot flashes, night sweats, and sleep and improve mood during perimenopause,” said Dr. Pinkerton. “So it’s likely that perimenopausal or menopausal women with long COVID would see improvements both due to the action of estradiol on the ovary seen during COVID and the improvements in symptoms.”
Estrogen-based hormone therapy has been linked to an increased risk for endometrial, breast, and ovarian cancer, according to the American Cancer Society. This means you should carefully consider how comfortable you are with those additional risks before starting this kind of therapy.
“Which of your symptoms are the most difficult to manage? You may see if you can navigate one to three of them. What are you willing to do for your symptoms? If a woman is willing to favor her sleep for the next 6 months to a year, she may be willing to change how she perceives her risk for cancer,” Dr. Flannery said. “What risk is a woman willing to take? I think if someone has a very low concern about a risk of cancer, and she’s suffering a disrupted life, then taking estradiol in a 1- to 2-year trial period could be critical to help.”
What else can help ease long COVID during menopause?
Getting the COVID vaccine, as well as a booster, could help. Vaccination helps prevent reinfection with COVID, which can worsen symptoms, and a new Swedish study found no evidence that vaccination causes postmenopausal problems such as irregular bleeding.
“Weak and inconsistent associations were observed between SARS-CoV-2 vaccination and healthcare contacts for bleeding in women who are postmenopausal, and even less evidence was recorded of an association for menstrual disturbance or bleeding in women who were premenopausal,” said study coauthor Rickard Ljung, MD, PhD, MPH, professor and acting head of the pharmacoepidemiology and analysis department in the division of use and information of the Swedish Medical Products Agency in Uppsala.
A version of this article first appeared on WebMD.com.
COVID vaccines safe for young children, study finds
TOPLINE:
COVID-19 vaccines from Moderna and Pfizer-BioNTech are safe for children under age 5 years, according to findings from a study funded by the Centers for Disease Control and Prevention.
METHODOLOGY:
- Data came from the Vaccine Safety Datalink, which gathers information from eight health systems in the United States.
- Analyzed data from 135,005 doses of the Pfizer-BioNTech vaccine given to children aged 4 years and younger and 112,006 doses of the Moderna vaccine given to children aged 5 years and younger.
- Assessed for 23 safety outcomes, including myocarditis, pericarditis, and seizures.
TAKEAWAY:
- One case of hemorrhagic stroke and one case of pulmonary embolism occurred after vaccination, but these were linked to preexisting congenital abnormalities.
IN PRACTICE:
“These results can provide reassurance to clinicians, parents, and policymakers alike.”
STUDY DETAILS:
The study was led by Kristin Goddard, MPH, a researcher at the Kaiser Permanente Vaccine Study Center in Oakland, Calif., and was funded by the Centers for Disease Control and Prevention.
LIMITATIONS:
The researchers reported low statistical power for early analysis, especially for rare outcomes. In addition, fewer than 25% of children in the database had received a vaccine at the time of analysis.
DISCLOSURES:
A coauthor reported receiving funding from Janssen Vaccines and Prevention for a study unrelated to COVID-19 vaccines. Another coauthor reported receiving grants from Pfizer in 2019 for clinical trials for coronavirus vaccines, and from Merck, GSK, and Sanofi Pasteur for unrelated research.
A version of this article first appeared on Medscape.com.
FROM PEDIATRICS
Review may help clinicians treat adolescents with depression
Depression is common among Canadian adolescents and often goes unnoticed, yet many family physicians report feeling unprepared to identify and manage depression in these patients, according to the authors of a new review.
“Depression is an increasingly common but treatable condition among adolescents,” the authors wrote. “Primary care physicians and pediatricians are well positioned to support the assessment and first-line management of depression in this group, helping patients to regain their health and function.”
The article was published in CMAJ.
Distinct presentation
More than 40% of cases of depression begin during childhood. Onset at this life stage is associated with worse severity of depression in adulthood and worse social, occupational, and physical health outcomes.
Depression is influenced by genetic and environmental factors. Family history of depression is associated with a three- to fivefold increased risk of depression among older children. Genetic loci are known to be associated with depression, but exposure to parental depression, adverse childhood experiences, and family conflict are also linked to greater risk. Bullying and stigma are associated with greater risk among lesbian, gay, bisexual, and transgender youth.
Compared with adults, adolescents with depression are more likely to be irritable and to have a labile mood, rather than a low mood. Social withdrawal is also more common among adolescents than among adults. Unusual features, such as hypersomnia and increased appetite, may also be present. Anxiety, somatic symptoms, psychomotor agitation, and hallucinations are more common in adolescents than in younger persons with depression. It is vital to assess risk of suicidality and self-injury as well as support systems, and validated scales such as the Columbia Suicide Severity Rating Scale can be useful.
There is no consensus as to whether universal screening for depression is beneficial among adolescents. “Screening in this age group may be a reasonable approach, however, when implemented together with adequate systems that ensure accurate diagnosis and appropriate follow-up,” wrote the authors.
Management of depression in adolescents should begin with psychoeducation and may include lifestyle modification, psychotherapy, and medication. “Importantly, a suicide risk assessment must be done to ensure appropriateness of an outpatient management plan and facilitate safety planning,” the authors wrote.
Lifestyle interventions may target physical activity, diet, and sleep, since unhealthy patterns in all three are associated with heightened symptoms of depression in this population. Regular moderate to vigorous physical activity, and perhaps physical activity of short duration, can improve mood in adolescents. Reduced consumption of sugar-sweetened drinks, processed foods, and meats, along with greater consumption of fruits and legumes, has been shown to reduce depressive symptoms in randomized, controlled trials with adults.
Among psychotherapeutic approaches, cognitive-behavioral therapy has shown the most evidence of efficacy among adolescents with depression, though it is less effective for those with more severe symptoms, poor coping skills, and nonsuicidal self-injury. Some evidence supports interpersonal therapy, which focuses on relationships and social functioning. The involvement of caregivers may improve the response, compared with psychotherapy that only includes the adolescent.
The authors recommend antidepressant medications in more severe cases or when psychotherapy is ineffective or not feasible. Guidelines generally support trials of at least two selective serotonin reuptake inhibitors (SSRIs) before switching to another drug class, since efficacy data for other classes are sparser and their side effect profiles are worse.
About 2% of adolescents with depression experience an increase in suicidal ideation and behavior after exposure to antidepressants, usually within the first weeks of initiation, so this potential risk should be discussed with patients and caregivers.
Clinicians feel unprepared
Commenting on the review, Pierre-Paul Tellier, MD, an associate professor of family medicine at McGill University, Montreal, said that clinicians frequently report that they do not feel confident in their ability to manage and diagnose adolescent depression. “We did two systematic reviews to look at the continuing professional development of family physicians in adolescent health, and it turned out that there’s really a very large lack. When we looked at residents and the training that they were getting in adolescent medicine, it was very similar, so they felt unprepared to deal with issues around mental health.”
Medication can be effective, but it can be seen as “an easy way out,” Dr. Tellier added. “It’s not necessarily an ideal plan. What we need to do is to change the person’s way of thinking, the person’s way of responding to a variety of things which will occur throughout their lives. People will have other transition periods in their lives. It’s best if they learn a variety of techniques to deal with depression.”
These techniques include exercise, relaxation methods [which reduce anxiety], and wellness training. Through such techniques, patients “learn a healthier way of living with themselves and who they are, and then this is a lifelong way of learning,” said Dr. Tellier. “If I give you a pill, what I’m teaching is, yes, you can feel better. But you’re not dealing with the problem, you’re just dealing with the symptoms.”
He frequently refers his patients to YouTube videos that outline and explain various strategies. A favorite is a deep breathing exercise presented by Jeremy Howick.
The authors and Dr. Tellier disclosed no relevant financial relationships.
A version of this article first appeared on Medscape.com.
FROM CMAJ




