Ultraprocessed foods tied to faster rate of cognitive decline
Results from the Brazilian Longitudinal Study of Adult Health (ELSA-Brasil), which included more than 10,000 people aged 35 and older, showed that higher intake of UPF was significantly associated with a faster rate of decline in executive and global cognitive function.
“These findings show that lifestyle choices, particularly high intake of ultraprocessed foods, can influence our cognitive health many years later,” coinvestigator Natalia Goncalves, PhD, University of São Paulo, Brazil, said in an interview.
The study was published online in JAMA Neurology.
The study’s findings were presented in August at the Alzheimer’s Association International Conference (AAIC) 2022 and were reported by this news organization at that time.
High sugar, salt, fat
The new results align with another recent study linking a diet high in UPFs to an increased risk for dementia.
UPFs are heavily processed and packed with added ingredients, including sugar, fat, and salt, and are low in protein and fiber. Examples include soft drinks, chips, chocolate, candy, ice cream, sweetened breakfast cereals, packaged soups, chicken nuggets, hot dogs, and fries.
The ELSA-Brasil study comprised 10,775 adults (mean age, 50.6 years at baseline; 55% women; 53% White) who were evaluated in three waves approximately 4 years apart from 2008 to 2017.
Information on diet was obtained via food frequency questionnaires and included details regarding consumption of unprocessed foods, minimally processed foods, and UPFs.
Participants were grouped according to UPF consumption quartiles (lowest to highest). Cognitive performance was evaluated by use of a standardized battery of tests.
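As an illustration of quartile grouping (this is not the study's own analysis, and the intake values below are made up), assigning participants to quartiles by their percentage of daily energy from UPFs can be sketched as:

```python
# Hypothetical sketch: group participants into intake quartiles (1 = lowest)
# based on the share of daily calories coming from ultraprocessed foods.
import numpy as np

upf_pct = np.array([8.0, 15.0, 22.0, 30.0, 41.0, 12.0, 27.0, 35.0])  # % of daily kcal from UPFs

# Quartile cut points taken from the sample distribution
cuts = np.percentile(upf_pct, [25, 50, 75])

# Each participant falls in the interval containing their intake value
quartile = np.searchsorted(cuts, upf_pct, side="right") + 1

print(quartile)  # [1 2 2 3 4 1 3 4]
```

In the study itself, quartile 1 corresponded to less than 20% of daily calories from UPFs; the cut points here are purely illustrative.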
During median follow-up of 8 years, people who consumed more than 20% of daily calories from UPFs (quartiles 2-4) experienced a 28% faster rate of decline in global cognition (beta = –0.004; 95% confidence interval [CI], –0.006 to –0.001; P = .003) and a 25% faster rate of decline in executive function (beta = –0.003, 95% CI, –0.005 to 0.000; P = .01) compared to peers in quartile 1 who consumed less than 20% of daily calories from UPFs.
The researchers did not investigate individual groups of UPFs.
However, Dr. Goncalves noted that some studies have linked the consumption of sugar-sweetened beverages with lower cognitive performance, lower brain volume, and poorer memory performance. Another group of ultraprocessed foods, processed meats, has been associated with increased all-cause dementia and Alzheimer’s disease.
Other limitations include the fact that self-reported diet habits were assessed only at baseline using a food frequency questionnaire that was not designed to assess the degree of processing.
While analyses were adjusted for several sociodemographic and clinical confounders, the researchers said they could not exclude the possibility of residual confounding.
Also, since neuroimaging is not available in the ELSA-Brasil study, they were not able to investigate potential mechanisms that could explain the association between higher UPF consumption and cognitive decline.
Despite these limitations, the researchers said their findings suggest that “limiting UPF consumption, particularly in middle-aged adults, may be an efficient form to prevent cognitive decline.”
Weighing the evidence
Several experts weighed in on the results in a statement from the UK nonprofit organization Science Media Centre.
Kevin McConway, PhD, with Open University, Milton Keynes, England, said it’s important to note that the study suggests “an association, a correlation, and that doesn’t necessarily mean that the cognitive decline was caused by eating more ultra-processed foods.”
He also noted that some types of cognitive decline that are associated with aging occurred in participants in all four quartiles, which were defined by the percentage of their daily energy that came from consuming UPFs.
“That’s hardly surprising – it’s a sad fact of life that pretty well all of us gradually lose some of our cognitive functions as we go through middle and older age,” Dr. McConway said.
“The study doesn’t establish that differences in speed of cognitive decline are caused by ultra-processed food consumption anyway. That’s because it’s an observational study. If the consumption of ultra-processed food causes the differences in rate of cognitive decline, then eating less of it might slow cognitive decline, but if the cause is something else, then that won’t happen,” Dr. McConway added.
Gunter Kuhnle, PhD, professor of nutrition and food science, University of Reading, England, noted that UPFs have become a “fashionable term to explain associations between diet and ill health, and many studies have attempted to show associations.
“Most studies have been observational and had a key limitation: It is very difficult to determine ultra-processed food intake using methods that are not designed to do so, and so authors need to make a lot of assumptions. Bread and meat products are often classed as ‘ultra-processed,’ even though this is often wrong,” Dr. Kuhnle noted.
“The same applies to this study – the method used to measure ultra-processed food intake was not designed for that task and relied on assumptions. This makes it virtually impossible to draw any conclusions,” Dr. Kuhnle said.
Duane Mellor, PhD, RD, RNutr, registered dietitian and senior teaching fellow, Aston University, Birmingham, England, said the study does not change how we should try to eat to maintain good brain function and cognition.
“We should try to eat less foods which are high in added sugar, salt, and fat, which would include many of the foods classified as being ultra-processed, while eating more in terms of both quantity and variety of vegetables, fruit, nuts, seeds, and pulses, which are known to be beneficial for both our cognitive and overall health,” Dr. Mellor said.
The ELSA-Brasil study was supported by the Brazilian Ministry of Health, the Ministry of Science, Technology and Innovation, and the National Council for Scientific and Technological Development. The authors as well as Dr. McConway, Dr. Mellor, and Dr. Kuhnle have disclosed no relevant financial relationships.
A version of this article first appeared on Medscape.com.
FROM JAMA NEUROLOGY
Confirmed: Amyloid, tau levels rise years before Alzheimer’s onset
“Our results confirm accelerated biomarker changes during preclinical AD and highlight the important role of amyloid levels in tau accelerations,” the investigators note.
“These data may suggest that there is a short therapeutic window for slowing AD pathogenesis prior to the emergence of clinical symptoms – and that this window may occur after amyloid accumulation begins but before amyloid has substantial impacts on tau accumulation,” study investigator Corinne Pettigrew, PhD, department of neurology, Johns Hopkins University School of Medicine, Baltimore, told this news organization.
The study was published online in Alzheimer’s and Dementia.
Novel long-term CSF data
The study builds on previous research by examining changes in cerebrospinal fluid (CSF) biomarkers over longer periods than had been studied previously, particularly among individuals who were largely middle-aged and cognitively normal at baseline.
The researchers examined changes in amyloid beta (Aβ) 42/Aβ40, phosphorylated tau181 (p-tau181), and total tau (t-tau) in CSF over an average of 10.7 years (and up to 23 years) among 278 largely middle-aged individuals who were cognitively normal at baseline.
“To our knowledge, no prior study among initially cognitively normal, primarily middle-aged individuals has described CSF AD biomarker changes over this duration of follow-up,” the researchers write.
During follow-up, 94 individuals who initially had normal cognition developed mild cognitive impairment (MCI).
Lower baseline levels of amyloid were associated with greater increases in tau (more strongly in men than women), while accelerations in tau were more closely linked to onset of MCI, the researchers report.
Among individuals who developed MCI, biomarker levels were more abnormal and tau increased to a greater extent prior to the onset of MCI symptoms, they found.
Clear impact of APOE4
The findings also suggest that among APOE4 carriers, amyloid onset occurs at an earlier age and rates of amyloid positivity are higher, but there are no differences in rates of change in amyloid over time.
“APOE4 genetic status was not related to changes in CSF beta-amyloid after accounting for the fact that APOE4 carriers have higher rates of amyloid positivity,” said Dr. Pettigrew.
“These findings suggest that APOE4 genetic status shifts the age of onset of amyloid accumulation (with APOE4 carriers having an earlier age of onset compared to non-carriers), but that APOE4 is not related to rates of change in CSF beta-amyloid over time,” she added.
“Thus, cognitively normal APOE4 carriers may be in more advanced preclinical AD stages at younger ages than individuals who are not APOE4 carriers, which is likely relevant for optimizing clinical trial recruitment strategies,” she said.
Funding for the study was provided by the National Institutes of Health. Dr. Pettigrew has disclosed no relevant financial relationships. The original article contains a complete list of author disclosures.
A version of this article first appeared on Medscape.com.
FROM ALZHEIMER’S AND DEMENTIA
Resilience and mind-body interventions in late-life depression
Resilience has been defined as the ability to adapt and thrive in the face of adversity, acute stress, or trauma.1 Originally conceived as an inborn trait characteristic, resilience is now conceptualized as a dynamic, multidimensional capacity influenced by the interactions between internal factors (eg, personality, cognitive capacity, physical health) and environmental resources (eg, social status, financial stability).2,3 Resilience in older adults (typically defined as age ≥65) can improve the prognosis and outcomes for physical and mental conditions.4 The construct is closely aligned with “successful aging” and can be fostered in older adults, leading to improved physical and mental health and well-being.5
While initially resilience was conceptualized as the opposite of depressive states, recent research has identified resilience in the context of major depressive disorder (MDD) as the net effects of various psychosocial and biological variables that decrease the risk of onset, relapse, or depressive illness severity and increase the probability or speed of recovery.6 Late-life depression (LLD) in adults age >65 is a common and debilitating disease, often leading to decreased psychological well-being, increased cognitive decline, and excess mortality.7,8 LLD is associated with several factors, such as cerebrovascular disease, neurodegenerative disease, and inflammation, all of which could contribute to brain vulnerability and an increased risk of depression.9 Physical and cognitive engagement, physical activity, and high brain reserve have been shown to confer resilience to affective and cognitive changes in older adults, despite brain vulnerability.9
The greatest levels of resilience have been observed in individuals in their fifth decade of life and later,4,10 with high levels of resilience significantly contributing to longevity5; however, little is known about which factors contribute to heterogeneity in resilience characteristics and outcomes.4 Furthermore, the concept of resilience continues to raise numerous questions, including:
- how resilience should be measured or defined
- what factors promote or deter the development of resilience
- the effects of resilience on various health and psychological outcomes
- which interventions are effective in enhancing resilience in older adults.4
In this article, we describe resilience in older adults with LLD, its clinical and neurocognitive correlates, and underlying neurobiological and immunological biomarkers. We also examine resilience-building interventions, such as mind-body therapies (MBTs), that have been shown to enhance resilience by promoting positive perceptions of difficult experiences and challenges.
Clinical and neurocognitive correlates of resilience
Resilience varies substantially among older adults with LLD as well as across the lifespan of an individual.11 Identifying clinical components and predictors of resilience may usefully inform the development and testing of interventions to prevent and treat LLD.11 One tool widely used to measure resilience—the self-report Connor-Davidson Resilience Scale (CD-RISC)12— has been found to have clinically relevant characteristics.1,11 Using data from 337 older adults with LLD, Laird et al11 performed an exploratory factor analysis of the CD-RISC and found a 4-factor model:
- grit
- adaptive coping self-efficacy
- accommodative coping self-efficacy
- spirituality.1,11
Having a strong sense of purpose and not being easily discouraged by failure were items characteristic of grit.1,11 The preference to take the lead in problem-solving was typical of items loading on adaptive coping self-efficacy, while accommodative coping self-efficacy measured flexibility, cognitive reframing, a sense of humor, and acceptance in the face of uncontrollable stress.1,11 Finally, the belief that “things happen for a reason” and that “sometimes fate or God can help me” are characteristics of spirituality.1,11 Using a multivariate model, the greatest variance in total resilience scores was explained by less depression, less apathy, higher quality of life, non-White race, and, somewhat counterintuitively, greater medical comorbidity.1,11 Thus, interventions designed to help older adults cultivate grit, active coping, accommodative coping, and spirituality may enhance resilience in LLD.
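For readers unfamiliar with the method, an exploratory factor analysis like the one Laird et al applied to the CD-RISC can be sketched as follows. This is a minimal illustration on synthetic questionnaire data, not the authors' analysis; the sample size and item count mirror the study's 337 participants and the 25-item CD-RISC, but all responses and loadings are simulated.

```python
# Illustrative sketch: exploratory factor analysis of questionnaire items,
# in the spirit of the 4-factor CD-RISC model described above.
# All data are synthetic; loadings and scores are not the study's results.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n_respondents, n_items, n_factors = 337, 25, 4  # CD-RISC has 25 items

# Simulate item responses driven by 4 latent factors plus noise.
latent = rng.normal(size=(n_respondents, n_factors))
loadings = rng.normal(size=(n_factors, n_items))
responses = latent @ loadings + rng.normal(scale=0.5, size=(n_respondents, n_items))

fa = FactorAnalysis(n_components=n_factors, random_state=0)
scores = fa.fit_transform(responses)  # per-respondent factor scores

print(fa.components_.shape)  # (4, 25): one loading per factor per item
print(scores.shape)          # (337, 4)
```

In practice, each extracted factor is interpreted by inspecting which items load most heavily on it; in the study, that inspection yielded the grit, adaptive coping, accommodative coping, and spirituality labels.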
Resilience may also be positively associated with cognitive functioning and could be neuroprotective in LLD.13 Laird et al13 investigated associations between baseline resilience and several domains of neurocognitive functioning in 288 older adults with LLD. Several positive associations were found between measured language performance and total resilience, active coping, and accommodative coping.13 Additionally, total resilience and accommodative coping were significantly associated with a lower self-reported frequency of forgetfulness, a subjective measure of memory used in this study.13 Together, these results suggest that interventions targeting language might be useful to improve coping in LLD.13 Another interesting finding was that the resilience subdomain of spirituality was negatively associated with memory, language, and executive functioning performance.13 A distinction must be made between religious attendance (eg, regular attendance at religious institutions) and religious beliefs, which may account for the previously reported associations between spirituality and improved cognition.13
Self-reported resilience may also predict greater responsivity to antidepressant medication in patients with LLD.14 Older adults with LLD and greater self-reported baseline resilience were more likely to experience improvement or remission from depression with antidepressant treatment.14 This is congruent with conceptualizations of resilience as “the ability to adapt to and recover from stress.”14,15 Of the 4 identified resilience factors (grit, adaptive coping, accommodative coping, and spirituality), it appears that accommodative coping predicts LLD treatment response and remission.14 The unique ability to accommodate is associated with better mental health outcomes in the face of uncontrollable stress.14,16-18 Older adults appear to engage in more accommodative coping due to frequent uncontrollable stress and aging-related physiological changes (eg, sleep changes, chronic pain, declining cognition). This could make accommodative coping especially important in this population.14,19
The Figure, adapted from Weisenbach et al,9 exhibits factors that contribute to LLD, including cerebrovascular disease, neurodegeneration, and chronic inflammation, all of which can lead to a vulnerable aging brain that is at higher risk for depression, particularly within the context of stress. Clinical and neurocognitive factors associated with resilience can help buffer vulnerable brains from developing depression.
Neurobiological biomarkers of resilience in LLD
Gross anatomical indicators: Findings from neuroimaging
The neurobiology underlying psychological resilience involves brain networks associated with stress response, negative affect, and emotional control.19 Increased amygdala reactivity and amygdala-frontal connectivity are often implicated in neurobiological models of resilience.20 Leaver et al20 correlated psychological resilience measures with amygdala function in 48 depressed and nondepressed older adults using functional magnetic resonance imaging. Specifically, they targeted the basolateral, centromedial, and superficial nuclei groups of the amygdala while comparing the 2 groups based on resilience scores (CD-RISC), depressive symptom severity, and depression status.20 A significant correlation was identified between resilience and connectivity between the superficial group of amygdala nuclei and the ventral default mode network (VDMN).20 High levels of psychological resilience were associated with lower basal amygdala activity and decreased connectivity between amygdala nuclei and the VDMN.20 Additionally, lower depressive symptoms were associated with higher connectivity between the amygdalae and the dorsal frontal networks.20 These results suggest a complex relationship between amygdala activity, dorsal frontal regions, resilience, and LLD.20
Vlasova et al21 further addressed the multifactorial character of psychological resilience by examining associations between the 4 factors of resilience and regional white matter integrity in older adults with LLD using diffusion-weighted MRI.21 Grit was associated with greater white matter integrity in the genu of the corpus callosum and the cingulum bundle in LLD.21 There was also a positive association between grit and fractional anisotropy (FA) in the callosal region connecting the prefrontal cortices, as well as FA in the cingulum fibers.21 However, the findings for FA in the cingulum fibers did not survive correction for multiple comparisons and should be interpreted with caution, pending further research.21
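As background on the metric itself (separate from the cited study's pipeline), FA is a scalar derived from the three eigenvalues of the voxelwise diffusion tensor, ranging from 0 (isotropic diffusion) to 1 (maximally anisotropic), and is the standard formula shown here:

```python
import math

def fractional_anisotropy(l1: float, l2: float, l3: float) -> float:
    """Compute FA from the three diffusion-tensor eigenvalues.

    FA = sqrt(1/2) * sqrt(sum of squared pairwise eigenvalue
    differences) / sqrt(sum of squared eigenvalues); values near 1
    are commonly read as greater white matter integrity.
    """
    num = (l1 - l2) ** 2 + (l2 - l3) ** 2 + (l1 - l3) ** 2
    den = l1 ** 2 + l2 ** 2 + l3 ** 2
    if den == 0:
        return 0.0  # no measurable diffusion signal
    return math.sqrt(0.5 * num / den)

# Equal eigenvalues (isotropic diffusion, eg, CSF-like) give FA = 0;
# one dominant eigenvalue (coherent fiber bundle) gives FA near 1.
fa_isotropic = fractional_anisotropy(1.0, 1.0, 1.0)   # 0.0
fa_fiber = fractional_anisotropy(1.7, 0.3, 0.2)       # ≈0.84
```

The eigenvalues above are illustrative; in practice they come from fitting the tensor model to diffusion-weighted images at each voxel.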
Stress response biomarkers of resilience
Stress response biomarkers include endocrine, immune, and inflammatory indices. Stress triggers inflammatory responses, and stress-related overstimulation of the hypothalamic-pituitary-adrenal (HPA) axis may increase the risk of LLD.22 Numerous studies have demonstrated an association between increased levels of peripheral proinflammatory cytokines and depressive symptoms in older adults.23 Interleukin-6 (IL-6) has been increasingly linked with depressive symptoms and poor memory performance in older adults.9 Inflammatory and vascular processes also appear to interact in predisposing to LLD, as increased levels of IL-6 and C-reactive protein have been associated with greater white matter pathology.9 Additionally, proinflammatory cytokines affect monoamine neurotransmitter pathways, leading to reduced tryptophan and serotonin synthesis, disruption of glucocorticoid receptors, and decreased hippocampal neurotrophic support.9 Alexopoulos et al24 further explain that a prolonged CNS immune response can affect emotional and cognitive network functions related to LLD and plays a role in the etiology of depressive symptoms in older adults.
Cardiovascular comorbidity and autonomic nervous system dysfunction
Many studies have revealed evidence of a bidirectional association between cardiovascular disease and depression.25 Dysregulation of the autonomic nervous system (ANS), indexed by heart rate variability (HRV), is an underlying mechanism that could explain the link between cardiovascular risk and MDD, though research examining age-related changes provides conflicting data.25,26 HRV is a surrogate index of resting cardiac vagal outflow that represents the ability of the ANS to adapt to psychological, social, and physical environmental changes.27 Higher overall HRV is associated with greater self-regulating capacity, including behavioral, cognitive, and emotional control.28 Additionally, higher HRV may serve as a biomarker of resilience to the development of stress-related disorders such as MDD. Recent studies have shown an overall reduction in HRV in older adults with LLD.29 When high- and low-frequency HRV were investigated separately, only low-frequency HRV was significantly reduced in patients with depression.29 One explanation is that older adults with depression have impaired or reduced baroreflex sensitivity and gain, which is often associated with an increased risk of mortality following cardiac events.30 More research is needed to better characterize the relationships among resilience, cardiovascular disease, and autonomic dysfunction.
The Box6,31,32 describes the relationship between markers of cellular health and resilience.
Box
Among the biomarkers of resilience, telomere length and telomerase activity serve as indices of biological aging that can diverge from chronological age and mark successful anti-aging, stress-reducing strategies.31 Telomerase, the cellular enzyme that preserves telomeres (repetitive DNA sequences at the ends of chromosomes) during cell division, is associated with overall cell health and cellular biological age.31 When telomerase activity is reduced, telomeres shorten with each round of cellular reproduction, causing cells to age more rapidly.31 Psychological stress may play a significant role in telomerase production and subsequent telomere length.32 Lavretsky et al32 evaluated the effect of brief daily yogic meditation on depressive symptoms and immune cell telomerase activity in family dementia caregivers with mild depressive symptoms. Brief daily meditation practice led to significantly lower levels of depressive symptoms, accompanied by an increase in telomerase activity, suggesting improvement in stress-induced cellular aging.6,32
Mind-body therapies
There is increasing interest in improving older adults’ physical and emotional well-being while promoting resilience through stress-reducing lifestyle interventions such as MBTs.33 Because MBTs are often considered a natural and safer option compared to conventional medicine, these interventions are rapidly gaining popularity in the United States.33,34 According to the 2017 National Health Interview Survey, use of yoga, meditation, and chiropractic care increased by 5% to 10% from 2012 to 2017, with growing evidence supporting MBTs as minimally invasive, cost-effective approaches for managing stress and neurocognitive disorders.35 In contrast to pharmacologic approaches, MBTs can be used to train individuals to self-regulate in the face of adversity and stress, thus increasing their resilience.
MBTs can be divided into mindful movement exercises and meditative practices. Mindful movement exercises include yoga, tai chi, and qigong. Meditative practices that do not include movement include progressive relaxation, mindfulness, meditation, and acceptance therapies. On average, both mindful movement exercise (eg, yoga) and multicomponent mindfulness-based interventions (eg, mindfulness-based cognitive therapy, mindfulness-based stress reduction [MBSR], and mindfulness-based relapse prevention) can be as effective as other active treatments for psychiatric disorders such as MDD, anxiety, and substance use disorders.36,37 MBSR specifically has been shown to increase empathy, self-control, self-compassion, relationship quality, mindfulness, and spirituality as well as decrease rumination in healthy older adults.38 This suggests that MBSR can help strengthen the 4 factors of resilience.
Research has also begun to evaluate the neurobiological mechanisms by which meditative therapies enhance resilience in mental health disorders, and several promising mechanistic domains (neural, hormonal, immune, cellular, and cardiovascular) have been identified.39 The physical discipline of yoga includes asanas (postures), pranayama (breathing techniques), and dhyana (meditation). With the inclusion of mindfulness training, yoga combines meditation with proprioceptive and interoceptive awareness, resulting in sustained attention and profound focus.40 Dedicated yoga practice allows an individual to develop skills to withdraw the senses (pratyahara), concentrate the mind (dharana), and establish unwavering awareness (dhyana).41 The physical and cognitive benefits associated with yoga and mindfulness may arise through several mechanisms: pranayama and activation of the parasympathetic nervous system; meditative or contemplative practices; increased body perception; stronger functional connectivity within the basal ganglia; and neuroplastic effects, including increased gray matter volume and regional amygdala enlargement.41 The new learning aspect of yoga practice may contribute to enhancing various aspects of cognition, although the mechanisms have yet to be clarified.
Continued research in this area will promote the integration of MBTs into mainstream clinical practice and help alleviate the growing chronic health burden of an aging population. During the COVID-19 pandemic, public interest in improving resilience and mental health42 has been supported by MBTs, which can improve coping with pandemic-related stress and enhance the function of critical organs (eg, lungs, heart, brain).43,44 In response to pandemic-related restrictions, many resources and health care services have adopted telehealth and virtual platforms to continue offering MBTs.45
Enhancing resilience to improve clinical outcomes
Increasing our understanding of clinical, neurocognitive, and neurobiological markers of resilience in older adults with and without depression could inform the development of interventions that treat and prevent mood and cognitive disorders of aging. Furthermore, stress reduction, decreased inflammation, and improved emotional regulation may have direct neuroplastic effects on the brain, leading to greater resilience. Complementary use of MBTs combined with standard antidepressant treatment may allow for additional improvement in clinical outcomes of LLD, including resilience, quality of life, general health, and cognitive function. Additional research testing the efficacy of those interventions designed to improve resilience in older adults with mood and mental disorders is needed.
Bottom Line
Identifying the clinical, neurocognitive, and neurobiological biomarkers of resilience in late-life depression could aid in the development of targeted interventions that treat and prevent mood and cognitive disorders of aging. Mind-body interventions can help boost resilience and improve outcomes in geriatric patients with mood and cognitive disorders.
Related Resources
- Lavretsky H. Resilience and Aging: Research and Practice. Johns Hopkins University Press; 2014.
- Lavretsky H, Sajatovic M, Reynolds CF, eds. Complementary and Integrative Therapies for Mental Health and Aging. Oxford University Press; 2016.
- Eyre HA, Berk M, Lavretsky H, et al, eds. Convergence Mental Health: A Transdisciplinary Approach to Innovation. Oxford University Press; 2021.
- UCLA Jane & Terry Semel Institute for Neuroscience & Human Behavior. Late-life Depression, Stress, and Wellness Research Program. https://www.semel.ucla.edu/latelife
1. Reynolds CF. Promoting resilience, reducing depression in older adults. Int Psychogeriatr. 2019;31(2):169-171.
2. Windle G. What is resilience? A review and concept analysis. Rev Clin Gerontol. 2011;21(2):152-169.
3. Southwick SM, Charney DS. The science of resilience: implications for the prevention and treatment of depression. Science. 2012;338(6103):79-82.
4. Dunn LB, Predescu I. Resilience: a rich concept in need of research comment on: “Neurocognitive correlates of resilience in late-life depression” (by Laird et al.). Am J Geriatr Psychiatry. 2019;27(1):18-20.
5. Harmell AL, Kamat R, Jeste DV, et al. Resilience-building interventions for successful and positive aging. In: Lavretsky H, Sajatovic M, Reynolds C III, eds. Complementary and Integrative Therapies for Mental Health and Aging. Oxford University Press; 2015:305-316.
6. Laird KT, Krause B, Funes C, et al. Psychobiological factors of resilience and depression in late life. Transl Psychiatry. 2019;9(1):88.
7. Byers AL, Yaffe K. Depression and risk of developing dementia. Nat Rev Neurol. 2011;7(6):323-331.
8. Callahan CM, Wolinsky FD, Stump TE, et al. Mortality, symptoms, and functional impairment in late-life depression. J Gen Intern Med. 1998;13(11):746-752.
9. Weisenbach SL, Kumar A. Current understanding of the neurobiology and longitudinal course of geriatric depression. Curr Psychiatry Rep. 2014;16(9):463.
10. Southwick SM, Litz BT, Charney D, et al. Resilience and Mental Health: Challenges Across the Lifespan. Cambridge University Press; 2011.
11. Laird KT, Lavretsky H, Paholpak P, et al. Clinical correlates of resilience factors in geriatric depression. Int Psychogeriatr. 2019;31(2):193-202.
12. Connor KM, Davidson JRT. Development of a new resilience scale: the Connor-Davidson Resilience Scale (CD-RISC). Depress Anxiety. 2003;18(2):76-82.
13. Laird KT, Lavretsky H, Wu P, et al. Neurocognitive correlates of resilience in late-life depression. Am J Geriatr Psychiatry. 2019;27(1):12-17.
14. Laird KT, Lavretsky H, St Cyr N, et al. Resilience predicts remission in antidepressant treatment of geriatric depression. Int J Geriatr Psychiatry. 2018;33(12):1596-1603.
15. Waugh CE, Koster EH. A resilience framework for promoting stable remission from depression. Clin Psychol Rev. 2015;41:49-60.
16. Boerner K. Adaptation to disability among middle-aged and older adults: the role of assimilative and accommodative coping. J Gerontol B Psychol Sci Soc Sci. 2004;59(1):P35-P42.
17. Zakowski SG, Hall MH, Klein LC, et al. Appraised control, coping, and stress in a community sample: a test of the goodness-of-fit hypothesis. Ann Behav Med. 2001;23(3):158-165.
18. Cheng C, Lau HB, Chan MP. Coping flexibility and psychological adjustment to stressful life changes: a meta-analytic review. Psychol Bull. 2014;140(6):1582-1607.
19. Stokes SA, Gordon SE. Common stressors experienced by the well elderly. Clinical implications. J Gerontol Nurs. 2003;29(5):38-46.
20. Leaver AM, Yang H, Siddarth P, et al. Resilience and amygdala function in older healthy and depressed adults. J Affect Disord. 2018;237:27-34.
21. Vlasova RM, Siddarth P, Krause B, et al. Resilience and white matter integrity in geriatric depression. Am J Geriatr Psychiatry. 2018;26(8):874-883.
22. Chopra K, Kumar B, Kuhad A. Pathobiological targets of depression. Expert Opin Ther Targets. 2011;15(4):379-400.
23. Martínez-Cengotitabengoa M, Carrascón L, O’Brien JT, et al. Peripheral inflammatory parameters in late-life depression: a systematic review. Int J Mol Sci. 2016;17(12):2022.
24. Alexopoulos GS, Morimoto SS. The inflammation hypothesis in geriatric depression. Int J Geriatr Psychiatry. 2011;26(11):1109-1118.
25. Carney RM, Freedland KE, Sheline YI, et al. Depression and coronary heart disease: a review for cardiologists. Clin Cardiol. 1997;20(3):196-200.
26. Carney RM, Freedland KE, Steinmeyer BC, et al. Nighttime heart rate predicts response to depression treatment in patients with coronary heart disease. J Affect Disord. 2016;200:165-171.
27. Appelhans BM, Luecken LJ. Heart rate variability as an index of regulated emotional responding. Rev Gen Psychol. 2006;10(3):229-240.
28. Holzman JB, Bridgett DJ. Heart rate variability indices as bio-markers of top-down self-regulatory mechanisms: a meta-analytic review. Neurosci Biobehav Rev. 2017;74(Pt A):233-255.
29. Brown L, Karmakar C, Gray R, et al. Heart rate variability alterations in late life depression: a meta-analysis. J Affect Disord. 2018;235:456-466.
30. La Rovere MT, Bigger JT Jr, Marcus FI, et al. Baroreflex sensitivity and heart-rate variability in prediction of total cardiac mortality after myocardial infarction. ATRAMI (Autonomic Tone and Reflexes After Myocardial Infarction) Investigators. Lancet. 1998;351(9101):478-484.
31. Chakravarti D, LaBella KA, DePinho RA. Telomeres: history, health, and hallmarks of aging. Cell. 2021;184(2):306-322.
32. Lavretsky H, Epel ES, Siddarth P, et al. A pilot study of yogic meditation for family dementia caregivers with depressive symptoms: effects on mental health, cognition, and telomerase activity. Int J Geriatr Psychiatry. 2013;28(1):57-65.
33. Siddiqui MJ, Min CS, Verma RK, et al. Role of complementary and alternative medicine in geriatric care: a mini review. Pharmacogn Rev. 2014;8(16):81-87.
34. Nguyen SA, Lavretsky H. Emerging complementary and integrative therapies for geriatric mental health. Curr Treat Options Psychiatry. 2020;7(4):447-470.
35. Clarke TC, Barnes PM, Black LI, et al. Use of yoga, meditation, and chiropractors among U.S. adults aged 18 and over. NCHS Data Brief. 2018;(325):1-8.
36. Hofmann SG, Gómez AF. Mindfulness-based interventions for anxiety and depression. Psychiatr Clin North Am. 2017;40(4):739-749.
37. Ramadas E, de Lima MP, Caetano T, et al. Effectiveness of mindfulness-based relapse prevention in individuals with substance use disorders: a systematic review. Behav Sci (Basel). 2021;11(10):133.
38. Chiesa A, Serretti A. Mindfulness-based stress reduction for stress management in healthy people: a review and meta-analysis. J Altern Complement Med. 2009;15(5):593-600.
39. Strauss C, Cavanagh K, Oliver A, et al. Mindfulness-based interventions for people diagnosed with a current episode of an anxiety or depressive disorder: a meta-analysis of randomised controlled trials. PLoS One. 2014;9(4):e96110.
40. Chobe S, Chobe M, Metri K, et al. Impact of yoga on cognition and mental health among elderly: a systematic review. Complement Ther Med. 2020;52:102421.
41. Brunner D, Abramovitch A, Etherton J. A yoga program for cognitive enhancement. PLoS One. 2017;12(8):e0182366.
42. Dai J, Sang X, Menhas R, et al. The influence of COVID-19 pandemic on physical health-psychological health, physical activity, and overall well-being: the mediating role of emotional regulation. Front Psychol. 2021;12:667461.
43. Grolli RE, Mingoti MED, Bertollo AG, et al. Impact of COVID-19 in the mental health in elderly: psychological and biological updates. Mol Neurobiol. 2021;58(5):1905-1916.
44. Johansson A, Mohamed MS, Moulin TC, et al. Neurological manifestations of COVID-19: a comprehensive literature review and discussion of mechanisms. J Neuroimmunol. 2021;358:577658.
45. Pandya SP. Older women and wellbeing through the pandemic: examining the effect of daily online yoga lessons. Health Care Women Int. 2021;42(11):1255-1278.
Resilience has been defined as the ability to adapt and thrive in the face of adversity, acute stress, or trauma.1 Originally conceived as an inborn trait, resilience is now conceptualized as a dynamic, multidimensional capacity influenced by the interactions between internal factors (eg, personality, cognitive capacity, physical health) and environmental resources (eg, social status, financial stability).2,3 Resilience in older adults (typically defined as age ≥65) can improve the prognosis and outcomes for physical and mental conditions.4 The construct is closely aligned with “successful aging” and can be fostered in older adults, leading to improved physical and mental health and well-being.5
While initially resilience was conceptualized as the opposite of depressive states, recent research has identified resilience in the context of major depressive disorder (MDD) as the net effects of various psychosocial and biological variables that decrease the risk of onset, relapse, or depressive illness severity and increase the probability or speed of recovery.6 Late-life depression (LLD) in adults age >65 is a common and debilitating disease, often leading to decreased psychological well-being, increased cognitive decline, and excess mortality.7,8 LLD is associated with several factors, such as cerebrovascular disease, neurodegenerative disease, and inflammation, all of which could contribute to brain vulnerability and an increased risk of depression.9 Physical and cognitive engagement, physical activity, and high brain reserve have been shown to confer resilience to affective and cognitive changes in older adults, despite brain vulnerability.9
The greatest levels of resilience have been observed in individuals in their fifth decade of life and later,4,10 with high levels of resilience significantly contributing to longevity5; however, little is known about which factors contribute to heterogeneity in resilience characteristics and outcomes.4 Furthermore, the concept of resilience continues to raise numerous questions, including:
- how resilience should be measured or defined
- what factors promote or deter the development of resilience
- the effects of resilience on various health and psychological outcomes
- which interventions are effective in enhancing resilience in older adults.4
MBTs can be divided into mindful movement exercises and meditative practices. Mindful movement exercises include yoga, tai chi, and qigong. Meditative practices that do not include movement include progressive relaxation, mindfulness, meditation, and acceptance therapies. On average, both mindful movement exercise (eg, yoga) and multicomponent mindfulness-based interventions (eg, mindfulness-based cognitive therapy, mindfulness-based stress reduction [MBSR], and mindfulness-based relapse prevention) can be as effective as other active treatments for psychiatric disorders such as MDD, anxiety, and substance use disorders.36,37 MBSR specifically has been shown to increase empathy, self-control, self-compassion, relationship quality, mindfulness, and spirituality as well as decrease rumination in healthy older adults.38 This suggests that MBSR can help strengthen the 4 factors of resilience.
Continue to: Research has also begun...
Research has also begun to evaluate the neurobiological mechanisms by which meditative therapies enhance resilience in mental health disorders, and several promising mechanistic domains (neural, hormonal, immune, cellular, and cardiovascular) have been identified.39 The physical yoga discipline includes asanas (postures), pranayama (breathing techniques), and dhyana (meditation). With the inclusion of mindfulness training, yoga involves the practice of meditation as well as the dynamic combination of proprioceptive and interoceptive awareness, resulting in both attention and profound focus.40 Dedicated yoga practice allows an individual to develop skills to withdraw the senses (pratyahara), concentrate the mind (dharana), and establish unwavering awareness (dhyana).41 The physical and cognitive benefits associated with yoga and mindfulness may be due to mechanisms including pranayama and activation of the parasympathetic nervous system; meditative or contemplative practices; increased body perception; stronger functional connectivity within the basal ganglia; or neuroplastic effects of increased grey matter volume and amygdala with regional enlargement.41 The new learning aspect of yoga practice may contribute to enhancing or improving various aspects of cognition, although the mechanisms are yet to be clarified.
Continued research in this area will promote the integration of MBTs into mainstream clinical practice and help alleviate the increased chronic health burden of an aging population. In the face of the COVID-19 pandemic, public interest in improving resilience and mental health42 can be supported by MBTs that can improve coping with the stress of the pandemic and enhance critical organ function (eg, lungs, heart, brain).43,44 As a result of these limitations, many resources and health care services have used telehealth and virtual platforms to adapt to these challenges and continue offering MBTs.45
Enhancing resilience to improve clinical outcomes
Increasing our understanding of clinical, neurocognitive, and neurobiological markers of resilience in older adults with and without depression could inform the development of interventions that treat and prevent mood and cognitive disorders of aging. Furthermore, stress reduction, decreased inflammation, and improved emotional regulation may have direct neuroplastic effects on the brain, leading to greater resilience. Complementary use of MBTs combined with standard antidepressant treatment may allow for additional improvement in clinical outcomes of LLD, including resilience, quality of life, general health, and cognitive function. Additional research testing the efficacy of those interventions designed to improve resilience in older adults with mood and mental disorders is needed.
Bottom Line
Identifying the clinical, neurocognitive, and neurobiological biomarkers of resilience in late-life depression could aid in the development of targeted interventions that treat and prevent mood and cognitive disorders of aging. Mind-body interventions can help boost resilience and improve outcomes in geriatric patients with mood and cognitive disorders.
Related Resources
- Lavretsky H. Resilience and Aging: Research and Practice. Johns Hopkins University Press; 2014.
- Lavretsky H, Sajatovic M, Reynolds CF, eds. Complementary and Integrative Therapies for Mental Health and Aging. Oxford University Press; 2016.
- Eyre HA, Berk M, Lavretsky H, et al, eds. Convergence Mental Health: A Transdisciplinary Approach to Innovation. Oxford University Press; 2021.
- UCLA Jane & Terry Semel Institute for Neuroscience & Human Behavior. Late-life Depression, Stress, and Wellness Research Program. https://www.semel.ucla.edu/latelife
Resilience has been defined as the ability to adapt and thrive in the face of adversity, acute stress, or trauma.1 Originally conceived as an inborn trait, resilience is now conceptualized as a dynamic, multidimensional capacity influenced by the interactions between internal factors (eg, personality, cognitive capacity, physical health) and environmental resources (eg, social status, financial stability).2,3 Resilience in older adults (typically defined as age ≥65) can improve the prognosis and outcomes for physical and mental conditions.4 The construct is closely aligned with “successful aging” and can be fostered in older adults, leading to improved physical and mental health and well-being.5
While resilience was initially conceptualized as the opposite of depressive states, recent research has identified resilience in the context of major depressive disorder (MDD) as the net effect of various psychosocial and biological variables that decrease the risk of onset or relapse, reduce the severity of depressive illness, and increase the probability or speed of recovery.6 Late-life depression (LLD) in adults age ≥65 is a common and debilitating disease, often leading to decreased psychological well-being, increased cognitive decline, and excess mortality.7,8 LLD is associated with several factors, such as cerebrovascular disease, neurodegenerative disease, and inflammation, all of which could contribute to brain vulnerability and an increased risk of depression.9 Physical and cognitive engagement, physical activity, and high brain reserve have been shown to confer resilience to affective and cognitive changes in older adults, despite brain vulnerability.9
The greatest levels of resilience have been observed in individuals in their fifth decade of life and later,4,10 with high resilience significantly contributing to longevity5; however, little is known about which factors contribute to heterogeneity in resilience characteristics and outcomes.4 Furthermore, the concept of resilience continues to raise numerous questions, including:
- how resilience should be measured or defined
- what factors promote or deter the development of resilience
- the effects of resilience on various health and psychological outcomes
- which interventions are effective in enhancing resilience in older adults.4
In this article, we describe resilience in older adults with LLD, its clinical and neurocognitive correlates, and underlying neurobiological and immunological biomarkers. We also examine resilience-building interventions, such as mind-body therapies (MBTs), that have been shown to enhance resilience by promoting positive perceptions of difficult experiences and challenges.
Clinical and neurocognitive correlates of resilience
Resilience varies substantially among older adults with LLD as well as across the lifespan of an individual.11 Identifying clinical components and predictors of resilience may usefully inform the development and testing of interventions to prevent and treat LLD.11 One tool widely used to measure resilience—the self-report Connor-Davidson Resilience Scale (CD-RISC)12—has been found to have clinically relevant characteristics.1,11 Using data from 337 older adults with LLD, Laird et al11 performed an exploratory factor analysis of the CD-RISC and found a 4-factor model:
- grit
- adaptive coping self-efficacy
- accommodative coping self-efficacy
- spirituality.1,11
Having a strong sense of purpose and not being easily discouraged by failure were items characteristic of grit.1,11 The preference to take the lead in problem-solving was typical of items loading on adaptive coping self-efficacy, while accommodative coping self-efficacy measured flexibility, cognitive reframing, a sense of humor, and acceptance in the face of uncontrollable stress.1,11 Finally, the belief that “things happen for a reason” and that “sometimes fate or God can help me” are characteristics of spirituality.1,11 In a multivariate model, the greatest variance in total resilience scores was explained by less depression, less apathy, higher quality of life, non-White race, and, somewhat counterintuitively, greater medical comorbidity.1,11 Thus, interventions designed to help older adults cultivate grit, active coping, accommodative coping, and spirituality may enhance resilience in LLD.
Resilience may also be positively associated with cognitive functioning and could be neuroprotective in LLD.13 Laird et al13 investigated associations between baseline resilience and several domains of neurocognitive functioning in 288 older adults with LLD. Several positive associations were found between measured language performance and total resilience, active coping, and accommodative coping.13 Additionally, total resilience and accommodative coping were significantly associated with a lower self-reported frequency of forgetfulness, a subjective measure of memory used in this study.13 Together, these results suggest that interventions targeting language might be useful to improve coping in LLD.13 Another interesting finding was that the resilience subdomain of spirituality was negatively associated with memory, language, and executive functioning performance.13 A distinction must be made between religious attendance (eg, regular attendance at religious institutions) and religious beliefs, which may account for the previously reported associations between spirituality and improved cognition.13
Self-reported resilience may also predict greater responsivity to antidepressant medication in patients with LLD.14 Older adults with LLD and greater self-reported baseline resilience were more likely to experience improvement or remission from depression with antidepressant treatment.14 This is congruent with conceptualizations of resilience as “the ability to adapt to and recover from stress.”14,15 Of the 4 identified resilience factors (grit, adaptive coping, accommodative coping, and spirituality), it appears that accommodative coping predicts LLD treatment response and remission.14 The unique ability to accommodate is associated with better mental health outcomes in the face of uncontrollable stress.14,16-18 Older adults appear to engage in more accommodative coping due to frequent uncontrollable stress and aging-related physiological changes (eg, sleep changes, chronic pain, declining cognition), which could make accommodative coping especially important in this population.14,19
The Figure, adapted from Weisenbach et al,9 illustrates factors that contribute to LLD, including cerebrovascular disease, neurodegeneration, and chronic inflammation, all of which can lead to a vulnerable aging brain that is at higher risk for depression, particularly within the context of stress. Clinical and neurocognitive factors associated with resilience can help buffer vulnerable brains from developing depression.
Neurobiological biomarkers of resilience in LLD
Gross anatomical indicators: Findings from neuroimaging
The neurobiology underlying psychological resilience involves brain networks associated with the stress response, negative affect, and emotional control.19 Increased amygdala reactivity and amygdala-frontal connectivity are often implicated in neurobiological models of resilience.20 Leaver et al20 used functional magnetic resonance imaging to correlate psychological resilience measures with amygdala function in 48 older adults with and without depression. Specifically, they targeted the basolateral, centromedial, and superficial nuclei groups of the amygdala while comparing the 2 groups based on resilience scores (CD-RISC), depressive symptom severity, and depression status.20 A significant correlation was identified between resilience and connectivity between the superficial group of amygdala nuclei and the ventral default mode network (VDMN).20 High levels of psychological resilience were associated with lower basal amygdala activity and decreased connectivity between amygdala nuclei and the VDMN.20 Additionally, lower depressive symptoms were associated with higher connectivity between the amygdalae and the dorsal frontal networks.20 These results suggest a complex relationship between amygdala activity, dorsal frontal regions, resilience, and LLD.20
Vlasova et al21 further addressed the multifactorial character of psychological resilience, using diffusion-weighted MRI to examine associations between the 4 factors of resilience and the regional integrity of white matter in older adults with LLD.21 Grit was found to be associated with greater white matter integrity in the genu of the corpus callosum and the cingulum bundle in LLD.21 There was also a positive association between grit and fractional anisotropy (FA) both in the callosal region connecting the prefrontal cortices and in the cingulum fibers.21 However, the cingulum FA results did not survive correction for multiple comparisons and should be considered with caution, pending further research.21
Stress response biomarkers of resilience
Stress response biomarkers include endocrine, immune, and inflammatory indices. Stress has been identified as a factor in inflammatory responses, and stress-related overstimulation of the hypothalamic-pituitary-adrenal (HPA) axis may increase the risk of LLD.22 Numerous studies have demonstrated an association between increased levels of peripheral proinflammatory cytokines and depressive symptoms in older adults.23 Interleukin-6 (IL-6) has been increasingly linked with depressive symptoms and poor memory performance in older adults.9 There also appears to be an interaction of inflammatory and vascular processes predisposing to LLD, as increased levels of IL-6 and C-reactive protein have been associated with greater white matter pathology.9 Additionally, proinflammatory cytokines impact monoamine neurotransmitter pathways, leading to a reduction in tryptophan and serotonin synthesis, disruption of glucocorticoid receptors, and a decrease in hippocampal neurotrophic support.9 Alexopoulos et al24 further explain that a prolonged CNS immune response can affect emotional and cognitive network functions related to LLD and has a role in the etiology of depressive symptoms in older adults.
Cardiovascular comorbidity and autonomic nervous system dysfunction
Many studies have revealed evidence of a bidirectional association between cardiovascular disease and depression.25 Dysregulation of the autonomic nervous system (ANS) is an underlying mechanism that could explain the link between cardiovascular risk and MDD via heart rate variability (HRV), though research examining age-related capacities provides conflicting data.25,26 HRV is a surrogate index of resting cardiac vagal outflow that represents the ability of the ANS to adapt to psychological, social, and physical environmental changes.27 Higher overall HRV is associated with greater self-regulating capacity, including behavioral, cognitive, and emotional control.28 Additionally, higher HRV may serve as a biomarker of resilience to the development of stress-related disorders such as MDD. Recent studies have shown an overall reduction in HRV in older adults with LLD.29 When high- and low-frequency HRV were investigated separately, only low-frequency HRV was significantly reduced in patients with depression.29 One explanation is that older adults with depression have impaired or reduced baroreflex sensitivity and gain, which is often associated with an increased risk of mortality following cardiac events.30 More research is needed to better characterize the relationship between resilience, cardiovascular disease, and autonomic dysfunction.
The Box6,31,32 describes the relationship between markers of cellular health and resilience.
Box
Among the biomarkers of resilience, telomere length and telomerase activity serve as indicators of biological aging that can differ from chronological age and mark successful anti-aging, stress-reducing strategies.31 Telomerase, the cellular enzyme that preserves telomeres (repetitive DNA sequences at the ends of chromosomes) during cell division, is associated with overall cell health and cellular biological age.31 When telomerase activity is reduced, telomeres shorten with each round of cellular reproduction, causing cells to age more rapidly.31 Psychological stress may play a significant role in telomerase production and subsequent telomere length.32 Lavretsky et al32 evaluated the effect of brief daily yogic meditation on depressive symptoms and immune cell telomerase activity in family dementia caregivers with mild depressive symptoms. Brief daily meditation practice led to significantly lower levels of depressive symptoms, accompanied by an increase in telomerase activity, suggesting improvement in stress-induced cellular aging.6,32
Mind-body therapies
There is increasing interest in improving older adults’ physical and emotional well-being while promoting resilience through stress-reducing lifestyle interventions such as MBTs.33 Because MBTs are often considered a natural and safer option than conventional medicine, these interventions are rapidly gaining popularity in the United States.33,34 According to the 2017 National Health Interview Survey, use of yoga, meditation, and chiropractic care increased by 5% to 10% from 2012 to 2017, and growing evidence supports MBTs as minimally invasive, cost-effective approaches for managing stress and neurocognitive disorders.35 In contrast to pharmacologic approaches, MBTs can be used to train individuals to self-regulate in the face of adversity and stress, thus increasing their resilience.
MBTs can be divided into mindful movement exercises and meditative practices. Mindful movement exercises include yoga, tai chi, and qigong. Meditative practices that do not involve movement include progressive relaxation, mindfulness, meditation, and acceptance therapies. On average, both mindful movement exercises (eg, yoga) and multicomponent mindfulness-based interventions (eg, mindfulness-based cognitive therapy, mindfulness-based stress reduction [MBSR], and mindfulness-based relapse prevention) can be as effective as other active treatments for psychiatric disorders such as MDD, anxiety, and substance use disorders.36,37 MBSR specifically has been shown to increase empathy, self-control, self-compassion, relationship quality, mindfulness, and spirituality, as well as decrease rumination, in healthy older adults.38 This suggests that MBSR can help strengthen the 4 factors of resilience.
Research has also begun to evaluate the neurobiological mechanisms by which meditative therapies enhance resilience in mental health disorders, and several promising mechanistic domains (neural, hormonal, immune, cellular, and cardiovascular) have been identified.39 The physical discipline of yoga includes asanas (postures), pranayama (breathing techniques), and dhyana (meditation). With the inclusion of mindfulness training, yoga involves the practice of meditation as well as the dynamic combination of proprioceptive and interoceptive awareness, resulting in both attention and profound focus.40 Dedicated yoga practice allows an individual to develop skills to withdraw the senses (pratyahara), concentrate the mind (dharana), and establish unwavering awareness (dhyana).41 The physical and cognitive benefits associated with yoga and mindfulness may be due to mechanisms including pranayama and activation of the parasympathetic nervous system; meditative or contemplative practices; increased body perception; stronger functional connectivity within the basal ganglia; or neuroplastic effects such as increased grey matter volume and regional enlargement of the amygdala.41 The new learning involved in yoga practice may also enhance various aspects of cognition, although the mechanisms are yet to be clarified.
Continued research in this area will promote the integration of MBTs into mainstream clinical practice and help alleviate the growing chronic health burden of an aging population. In the face of the COVID-19 pandemic, public interest in improving resilience and mental health42 can be supported by MBTs, which can improve coping with the stress of the pandemic and enhance critical organ function (eg, lungs, heart, brain).43,44 With in-person care limited during the pandemic, many resources and health care services have used telehealth and virtual platforms to adapt to these challenges and continue offering MBTs.45
Enhancing resilience to improve clinical outcomes
Increasing our understanding of clinical, neurocognitive, and neurobiological markers of resilience in older adults with and without depression could inform the development of interventions that treat and prevent mood and cognitive disorders of aging. Furthermore, stress reduction, decreased inflammation, and improved emotional regulation may have direct neuroplastic effects on the brain, leading to greater resilience. Complementary use of MBTs combined with standard antidepressant treatment may allow for additional improvement in clinical outcomes of LLD, including resilience, quality of life, general health, and cognitive function. Additional research testing the efficacy of interventions designed to improve resilience in older adults with mood and mental disorders is needed.
Bottom Line
Identifying the clinical, neurocognitive, and neurobiological biomarkers of resilience in late-life depression could aid in the development of targeted interventions that treat and prevent mood and cognitive disorders of aging. Mind-body interventions can help boost resilience and improve outcomes in geriatric patients with mood and cognitive disorders.
Related Resources
- Lavretsky H. Resilience and Aging: Research and Practice. Johns Hopkins University Press; 2014.
- Lavretsky H, Sajatovic M, Reynolds CF, eds. Complementary and Integrative Therapies for Mental Health and Aging. Oxford University Press; 2016.
- Eyre HA, Berk M, Lavretsky H, et al, eds. Convergence Mental Health: A Transdisciplinary Approach to Innovation. Oxford University Press; 2021.
- UCLA Jane & Terry Semel Institute for Neuroscience & Human Behavior. Late-life Depression, Stress, and Wellness Research Program. https://www.semel.ucla.edu/latelife
1. Reynolds CF. Promoting resilience, reducing depression in older adults. Int Psychogeriatr. 2019;31(2):169-171.
2. Windle G. What is resilience? A review and concept analysis. Rev Clin Gerontol. 2011;21(2):152-169.
3. Southwick SM, Charney DS. The science of resilience: implications for the prevention and treatment of depression. Science. 2012;338(6103):79-82.
4. Dunn LB, Predescu I. Resilience: a rich concept in need of research comment on: “Neurocognitive correlates of resilience in late-life depression” (by Laird et al.). Am J Geriatr Psychiatry. 2019;27(1):18-20.
5. Harmell AL, Kamat R, Jeste DV, et al. Resilience-building interventions for successful and positive aging. In: Lavretsky H, Sajatovic M, Reynolds C III, eds. Complementary and Integrative Therapies for Mental Health and Aging. Oxford University Press; 2015:305-316.
6. Laird KT, Krause B, Funes C, et al. Psychobiological factors of resilience and depression in late life. Transl Psychiatry. 2019;9(1):88.
7. Byers AL, Yaffe K. Depression and risk of developing dementia. Nat Rev Neurol. 2011;7(6):323-331.
8. Callahan CM, Wolinsky FD, Stump TE, et al. Mortality, symptoms, and functional impairment in late-life depression. J Gen Intern Med. 1998;13(11):746-752.
9. Weisenbach SL, Kumar A. Current understanding of the neurobiology and longitudinal course of geriatric depression. Curr Psychiatry Rep. 2014;16(9):463.
10. Southwick SM, Litz BT, Charney D, et al. Resilience and Mental Health: Challenges Across the Lifespan. Cambridge University Press; 2011.
11. Laird KT, Lavretsky H, Paholpak P, et al. Clinical correlates of resilience factors in geriatric depression. Int Psychogeriatr. 2019;31(2):193-202.
12. Connor KM, Davidson JRT. Development of a new resilience scale: the Connor-Davidson Resilience Scale (CD-RISC). Depress Anxiety. 2003;18(2):76-82.
13. Laird KT, Lavretsky H, Wu P, et al. Neurocognitive correlates of resilience in late-life depression. Am J Geriatr Psychiatry. 2019;27(1):12-17.
14. Laird KT, Lavretsky H, St Cyr N, et al. Resilience predicts remission in antidepressant treatment of geriatric depression. Int J Geriatr Psychiatry. 2018;33(12):1596-1603.
15. Waugh CE, Koster EH. A resilience framework for promoting stable remission from depression. Clin Psychol Rev. 2015;41:49-60.
16. Boerner K. Adaptation to disability among middle-aged and older adults: the role of assimilative and accommodative coping. J Gerontol B Psychol Sci Soc Sci. 2004;59(1):P35-P42.
17. Zakowski SG, Hall MH, Klein LC, et al. Appraised control, coping, and stress in a community sample: a test of the goodness-of-fit hypothesis. Ann Behav Med. 2001;23(3):158-165.
18. Cheng C, Lau HB, Chan MP. Coping flexibility and psychological adjustment to stressful life changes: a meta-analytic review. Psychol Bull. 2014;140(6):1582-1607.
19. Stokes SA, Gordon SE. Common stressors experienced by the well elderly. Clinical implications. J Gerontol Nurs. 2003;29(5):38-46.
20. Leaver AM, Yang H, Siddarth P, et al. Resilience and amygdala function in older healthy and depressed adults. J Affect Disord. 2018;237:27-34.
21. Vlasova RM, Siddarth P, Krause B, et al. Resilience and white matter integrity in geriatric depression. Am J Geriatr Psychiatry. 2018;26(8):874-883.
22. Chopra K, Kumar B, Kuhad A. Pathobiological targets of depression. Expert Opin Ther Targets. 2011;15(4):379-400.
23. Martínez-Cengotitabengoa M, Carrascón L, O’Brien JT, et al. Peripheral inflammatory parameters in late-life depression: a systematic review. Int J Mol Sci. 2016;17(12):2022.
24. Alexopoulos GS, Morimoto SS. The inflammation hypothesis in geriatric depression. Int J Geriatr Psychiatry. 2011;26(11):1109-1118.
25. Carney RM, Freedland KE, Sheline YI, et al. Depression and coronary heart disease: a review for cardiologists. Clin Cardiol. 1997;20(3):196-200.
26. Carney RM, Freedland KE, Steinmeyer BC, et al. Nighttime heart rate predicts response to depression treatment in patients with coronary heart disease. J Affect Disord. 2016;200:165-171.
27. Appelhans BM, Luecken LJ. Heart rate variability as an index of regulated emotional responding. Rev Gen Psychol. 2006;10(3):229-240.
28. Holzman JB, Bridgett DJ. Heart rate variability indices as bio-markers of top-down self-regulatory mechanisms: a meta-analytic review. Neurosci Biobehav Rev. 2017;74(Pt A):233-255.
29. Brown L, Karmakar C, Gray R, et al. Heart rate variability alterations in late life depression: a meta-analysis. J Affect Disord. 2018;235:456-466.
30. La Rovere MT, Bigger JT Jr, Marcus FI, et al. Baroreflex sensitivity and heart-rate variability in prediction of total cardiac mortality after myocardial infarction. ATRAMI (Autonomic Tone and Reflexes After Myocardial Infarction) Investigators. Lancet. 1998;351(9101):478-484.
U.S. flu activity already at mid-season levels
Flu activity in the United States has already reached mid-season levels, according to the Centers for Disease Control and Prevention.
Nationally, 6% of all outpatient visits were because of flu or flu-like illness for the week of Nov. 13-19, up from 5.8% the previous week, the CDC’s Influenza Division said in its weekly FluView report.
Those figures are the highest recorded in November since 2009, but the peak of the 2009-10 flu season occurred even earlier – the week of Oct. 18-24 – and the rate of flu-like illness had already dropped to just over 4.0% by Nov. 15-21 that year and continued to drop thereafter.
Although COVID-19 and respiratory syncytial virus (RSV) are included in the data from the CDC’s Outpatient Influenza-like Illness Surveillance Network, the agency noted that “seasonal influenza activity is elevated across the country” and estimated that “there have been at least 6.2 million illnesses, 53,000 hospitalizations, and 2,900 deaths from flu” during the 2022-23 season.
Total flu deaths include 11 reported in children as of Nov. 19, and children ages 0-4 had a higher proportion of visits for flu-like illness than other age groups.
The agency also said the cumulative hospitalization rate of 11.3 per 100,000 population “is higher than the rate observed in [the corresponding week of] every previous season since 2010-2011.” Adults 65 years and older have the highest cumulative rate, 25.9 per 100,000, for this year, compared with 20.7 for children 0-4; 11.1 for adults 50-64; 10.3 for children 5-17; and 5.6 for adults 18-49 years old, the CDC said.
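Rates reported per 100,000 population can be translated into expected case counts for a given population. A minimal sketch using the CDC figures above; the 300,000 population size is a made-up example for illustration, not CDC data:

```python
# Cumulative flu hospitalization rates per 100,000 population (CDC FluView figures cited above)
rates_per_100k = {
    "overall": 11.3,
    "65+": 25.9,
    "0-4": 20.7,
    "50-64": 11.1,
    "5-17": 10.3,
    "18-49": 5.6,
}

population = 300_000  # hypothetical community size, chosen for illustration

# Expected hospitalizations = rate / 100,000 * population
expected = {group: rate / 100_000 * population for group, rate in rates_per_100k.items()}
print(round(expected["overall"], 1))  # 33.9
```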
A version of this article first appeared on WebMD.com.
Sarilumab effective for polymyalgia rheumatica in phase 3 trial
PHILADELPHIA – Treatment with the interleukin-6 receptor antagonist sarilumab (Kevzara), along with a 14-week taper of glucocorticoids, proved to have significant efficacy in patients with relapsing polymyalgia rheumatica (PMR) who were resistant to glucocorticoids in a phase 3 trial.
No new safety concerns were found with sarilumab in the multicenter, randomized, double-blind, placebo-controlled SAPHYR trial. Sarilumab is approved in the United States for the treatment of moderate to severe active rheumatoid arthritis in adults who have had an inadequate response or intolerance to one or more disease-modifying antirheumatic drugs.
The results, presented at the annual meeting of the American College of Rheumatology by Robert Spiera, MD, director of the Scleroderma, Vasculitis, and Myositis Center at the Hospital for Special Surgery in New York, included clinically meaningful improvement in quality-of-life scores.
The disease, which primarily affects people over age 65, can cause widespread aching and stiffness. It’s one of the most common inflammatory diseases among older adults.
PMR is relatively easy to treat with glucocorticoids, but relapses are common, which means long courses of glucocorticoid therapy and the side effects that come with them.
Need for a steroid-sparing therapy
“We recognize that a steroid-sparing drug in polymyalgia rheumatica seems to be an unmet need,” Dr. Spiera said at the meeting.
The trial, sponsored by Sanofi, included active, refractory PMR patients who flared within 3 months of study entry while on at least 7.5 mg/day of prednisone or the equivalent. They were randomly assigned (1:1) to 52 weeks of treatment with subcutaneous sarilumab 200 mg every 2 weeks plus the rapid 14-week glucocorticoid tapering regimen or were given placebo every 2 weeks plus a more traditional 52-week tapering of glucocorticoids.
COVID hampered recruitment
Recruitment was stopped early because of complications during the COVID-19 pandemic, so between October 2018 and July 2020, 118 of the intended 280 patients were recruited, and 117 were treated (sarilumab = 59, placebo = 58). Median age was 69 years in the treatment group and 70 among those taking placebo.
Of the 117 treated, only 78 patients (67%) completed treatment (sarilumab = 42, placebo = 36). The main reasons for stopping treatment were adverse events (seven with sarilumab, four with placebo) and lack of efficacy (four with sarilumab, nine with placebo).
The primary outcome was the proportion of patients who reached sustained remission at 52 weeks, defined as disease remission by week 12 and no disease flare, normal C-reactive protein (CRP), and adherence to the glucocorticoid taper during weeks 12-52.
The researchers found that sustained remission was significantly higher in the sarilumab arm versus the control group (28.3% versus 10.3%; P = .0193).
IL-6 inhibitors lower CRP, but if you take CRP out of the definition, Dr. Spiera said, “we still saw this difference: 31.7% of patients treated with sarilumab and 13.8% treated with placebo and a longer taper achieved that endpoint.”
Forty-four percent lower risk of flare with sarilumab
Patients in the sarilumab group also had 44% lower risk of having a flare after achieving clinical remission versus the comparator group (16.7% versus 29.3%; hazard ratio, 0.56; 95% confidence interval, 0.35-0.90; P = .0153).
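The “44% lower risk” headline figure follows directly from the reported hazard ratio. A brief sketch of that arithmetic, using only the values reported above:

```python
# The relative risk reduction implied by a hazard ratio (HR) is 1 - HR.
hazard_ratio = 0.56  # flare after clinical remission, sarilumab vs placebo

risk_reduction = 1 - hazard_ratio
print(f"{risk_reduction:.0%}")  # 44%
```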
Patient-reported outcomes, which included physical and mental health scores and disability index results, favored sarilumab.
The incidence of treatment-emergent adverse events (TEAEs) was numerically higher in the sarilumab group, compared with the control group (94.9% versus 84.5%). TEAEs included neutropenia (15.3%) and arthralgia (15.3%) in the sarilumab group and insomnia (15.5%) in the comparator arm.
However, the frequency of serious AEs was higher in the control group, compared with the sarilumab arm (20.7% versus 13.6%). No deaths were reported, and, importantly in this age group treated with concurrent glucocorticoids and an IL-6 inhibitor, Dr. Spiera said, “there were no cases of diverticulitis requiring intervention.”
Dr. Spiera was asked about a seemingly low remission rate. He answered that the bar was very high for remission in this study.
Patients had to achieve remission by week 12 and with the rapid 14-week taper. “That means by week 12 the sarilumab arm patients were only on 2 mg of daily prednisone or its equivalent,” he said.
Patients had to maintain that for another 40 weeks, he noted, adding, “I think especially in the context of quality of life and function indices, these were important results.”
Sebastian E. Sattui, MD, director of the University of Pittsburgh Medical Center vasculitis clinic, told this news organization that prolonged use of glucocorticoids in patients with PMR remains an important concern and the need for other options is critical.
“Around 30% of patients with PMR remain on prednisone 5 years after diagnosis,” he said. “Low-dose glucocorticoids are still associated with significant morbidity. Until recently, there has been a paucity of high-quality data regarding the use of steroid-sparing agents in PMR.”
He noted that the SAPHYR trial data are promising “with sarilumab being successful in achieving remission while minimizing glucocorticoids in patients with relapsing PMR.” The clinically meaningful improvement in patient-reported outcomes was just as important, he added.
The main unanswered question is whether the disease-modifying ability of sarilumab will continue after it is stopped, Dr. Sattui said.
Dr. Spiera is a consultant for Sanofi, which funded the trial. He also disclosed financial relationships with GlaxoSmithKline, Boehringer Ingelheim, Corbus, InflaRx, AbbVie/Abbott, Novartis, Chemocentryx, Roche, and Vera. Dr. Sattui has received research support from AstraZeneca and has done unpaid consulting work for Sanofi.
A version of this article first appeared on Medscape.com.
AT ACR 2022
Anesthetic Choices and Postoperative Delirium Incidence: Propofol vs Sevoflurane
Study 1 Overview (Chang et al)
Objective: To assess the incidence of postoperative delirium (POD) following propofol- vs sevoflurane-based anesthesia in geriatric spine surgery patients.
Design: Retrospective, single-blinded observational study of propofol- and sevoflurane-based anesthesia cohorts.
Setting and participants: Patients eligible for this study were aged 65 years or older admitted to the SMG-SNU Boramae Medical Center (Seoul, South Korea). All patients underwent general anesthesia via either intravenous propofol or inhalational sevoflurane for spine surgery between January 2015 and December 2019. Patients were retrospectively identified via electronic medical records. Exclusion criteria included preoperative delirium, history of dementia, psychiatric disease, alcoholism, hepatic or renal dysfunction, postoperative mechanical ventilation dependence, other surgery within the previous 6 months, maintenance of intraoperative anesthesia with combined anesthetics, or incomplete medical records.
Main outcome measures: The primary outcome was the incidence of POD after administration of propofol- and sevoflurane-based anesthesia during hospitalization. Patients were screened for POD regularly by attending nurses using the Nursing Delirium Screening Scale (disorientation, inappropriate behavior, inappropriate communication, hallucination, and psychomotor retardation) during the entirety of the patient’s hospital stay; if 1 or more screening criteria were met, a psychiatrist was consulted for the proper diagnosis and management of delirium. A psychiatric diagnosis was required for a case to be counted toward the incidence of POD in this study. Secondary outcomes included postoperative 30-day complications (angina, myocardial infarction, transient ischemic attack/stroke, pneumonia, deep vein thrombosis, pulmonary embolism, acute kidney injury, or infection) and length of postoperative hospital stay.
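The screening rule described above (any positive Nursing Delirium Screening Scale item triggers a psychiatry consult) can be sketched as follows. The item names come from the study description; the scoring convention (any score above 0 counts as a met criterion) is an assumption for illustration, not the study’s exact protocol:

```python
# Items from the Nursing Delirium Screening Scale (Nu-DESC) as listed in the study
NUDESC_ITEMS = (
    "disorientation",
    "inappropriate behavior",
    "inappropriate communication",
    "hallucination",
    "psychomotor retardation",
)

def needs_psych_consult(scores: dict) -> bool:
    """Return True if 1 or more screening criteria are met.

    A score above 0 on any item is treated as a met criterion here,
    which is an illustrative assumption about the scoring threshold.
    """
    return any(scores.get(item, 0) > 0 for item in NUDESC_ITEMS)

print(needs_psych_consult({"disorientation": 1}))  # True
print(needs_psych_consult({}))  # False
```

In the study, a psychiatrist’s diagnosis, not the screen alone, determined whether a case counted toward POD incidence.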
Main results: POD occurred in 29 of the 281 patients (10.3%) in the total cohort. POD was more common in the sevoflurane group than in the propofol group (15.7% vs 5.0%; P = .003). In multivariable logistic regression, sevoflurane-based anesthesia was associated with an increased risk of POD compared with propofol-based anesthesia (odds ratio [OR], 4.120; 95% CI, 1.549-10.954; P = .005). Choice of anesthetic was not associated with postoperative 30-day complications or with length of postoperative hospital stay. Both older age (OR, 1.242; 95% CI, 1.130-1.366; P < .001) and a higher pain score on postoperative day 1 (OR, 1.338; 95% CI, 1.056-1.696; P = .016) were associated with increased risk of POD.
Conclusion: Propofol-based anesthesia was associated with a lower incidence of and risk for POD than sevoflurane-based anesthesia in older patients undergoing spine surgery.
Study 2 Overview (Mei et al)
Objective: To determine the incidence and duration of POD in older patients after total knee/hip replacement (TKR/THR) under intravenous propofol or inhalational sevoflurane general anesthesia.
Design: Randomized clinical trial comparing propofol- and sevoflurane-based general anesthesia.
Setting and participants: This study was conducted at the Shanghai Tenth People’s Hospital and involved 209 participants enrolled between June 2016 and November 2019. All participants were 60 years of age or older, scheduled for TKR/THR surgery under general anesthesia, American Society of Anesthesiologists (ASA) class I to III, and assessed to be of normal cognitive function preoperatively via a Mini-Mental State Examination. Participant exclusion criteria included preexisting delirium as assessed by the Confusion Assessment Method (CAM), prior diagnosed neurological diseases (eg, Parkinson’s disease), prior diagnosed mental disorders (eg, schizophrenia), or impaired vision or hearing that would influence cognitive assessments. All participants were randomly assigned to either sevoflurane or propofol anesthesia for their surgery via a computer-generated list. Of these, 103 received inhalational sevoflurane and 106 received intravenous propofol. All participants received standardized postoperative care.
Main outcome measures: All participants were interviewed twice daily on postoperative days 1, 2, and 3 by investigators blinded to the anesthesia regimen, using the CAM and a CAM-based scoring system (CAM-S) to assess delirium severity. The CAM evaluates 4 features: acute onset and fluctuating course, inattention, disorganized thinking, and altered level of consciousness. A diagnosis of delirium requires the first and second features plus either the third or the fourth. Delirium severity was indicated by the average of the CAM-S scores across the 3 postoperative days, while the incidence and duration of delirium were determined by the presence of delirium on CAM assessment on any postoperative day.
Main results: All eligible participants (N = 209; mean [SD] age, 71.2 [6.7] years; 29.2% male) were included in the final analysis. The incidence of POD did not differ significantly between the propofol and sevoflurane groups (33.0% vs 23.3%; P = .119, chi-square test); it was estimated that 316 participants per arm would have been needed to detect a statistically significant difference. The number of days with POD per person was higher with propofol than with sevoflurane anesthesia (0.5 [0.8] vs 0.3 [0.5]; P = .049, Student’s t-test).
Conclusion: This underpowered study showed a 9.7-percentage-point difference in the incidence of POD between older adults who received propofol (33.0%) and those who received sevoflurane (23.3%) after THR/TKR. Larger studies are needed to compare general anesthetics and clarify their role in POD.
Commentary
Delirium is an acute confusional state characterized by fluctuating mental status, inattention, disorganized thinking, and altered level of consciousness. It is often precipitated by medications and their adverse effects, infections, electrolyte imbalances, and other clinical etiologies. Delirium often manifests in postsurgical settings, disproportionately affects older patients, and leads to increased morbidity, mortality, hospital length of stay, and health care costs.1 Intraoperative risk factors for POD are determined by the degree of operative stress (eg, lower-risk surgeries confer lower risk for POD than higher-risk surgeries) and are additive to preexisting patient-specific risk factors, such as older age and functional impairment.1 Because operative stress is associated with risk for POD, limiting operative stress in controlled ways, such as through the choice of anesthetic agent, may be a pragmatic way to manage operative risk and optimize outcomes, especially in a surgically vulnerable population.
In Study 1, Chang et al sought to assess whether 2 commonly utilized general anesthetics, propofol and sevoflurane, in older patients undergoing spine surgery differentially affected the incidence of POD. In this retrospective, single-blinded observational study of 281 geriatric patients, the researchers found that sevoflurane was associated with a higher risk of POD as compared to propofol. However, these anesthetics were not associated with surgical outcomes such as postoperative 30-day complications or the length of postoperative hospital stay. While these findings added new knowledge to this field of research, several limitations should be kept in mind when interpreting this study’s results. For instance, the sample size was relatively small, with all cases selected from a single center utilizing a retrospective analysis. In addition, although a standardized nursing screening tool was used as a method for delirium detection, hypoactive delirium or less symptomatic delirium may have been missed, which in turn would lead to an underestimation of POD incidence. The latter is a common limitation in delirium research.
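To put Study 1’s effect size in context, the unadjusted odds ratio can be roughly reconstructed from the reported figures. The 2×2 counts below are back-calculated from the cohort size (281), the overall POD count (29), and the group rates (15.7% vs 5.0%); they are an illustrative assumption, not data taken from the paper.

```python
from math import exp, log, sqrt

# Back-calculated 2x2 table (assumed, for illustration):
# sevoflurane: 22/140 = 15.7% with POD; propofol: 7/141 = 5.0% with POD
sevo_pod, sevo_no = 22, 118
prop_pod, prop_no = 7, 134

# Unadjusted odds ratio from the 2x2 table
or_unadj = (sevo_pod / sevo_no) / (prop_pod / prop_no)

# Wald 95% CI on the log-odds scale
se = sqrt(1 / sevo_pod + 1 / sevo_no + 1 / prop_pod + 1 / prop_no)
lo = exp(log(or_unadj) - 1.96 * se)
hi = exp(log(or_unadj) + 1.96 * se)
print(f"unadjusted OR = {or_unadj:.2f} (95% CI, {lo:.2f}-{hi:.2f})")
```

The unadjusted estimate comes out near 3.6, somewhat below the paper’s multivariable OR of 4.120, which is expected since the latter adjusts for covariates such as age and postoperative pain score.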
In Study 2, Mei et al similarly explored the effects of general anesthetics on POD in older surgical patients. Using a randomized clinical trial design, the investigators compared propofol with sevoflurane in older patients undergoing TKR/THR and examined each agent’s role in the incidence, severity, and duration of POD. Although the incidence of POD was higher with propofol than with sevoflurane, the trial was underpowered and the difference did not reach statistical significance. In addition, while the duration of POD was slightly longer in the propofol group than in the sevoflurane group (0.5 vs 0.3 days), it is unclear whether this difference is clinically significant. As with many studies of POD, limitations of Study 2 included a small sample size (209 patients), with all participants enrolled from a single center. On the other hand, the study illustrated the feasibility of reproducible, prospective assessment of the POD time course using the CAM and CAM-S.
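The power gap in Study 2 can be checked with a quick sketch. The 2×2 counts below are back-calculated from the reported group sizes and rates (33.0% of 106 propofol patients, 23.3% of 103 sevoflurane patients), so they are an assumption for illustration; the sample-size step uses the standard two-proportion normal-approximation formula, which lands in the neighborhood of, though not exactly at, the authors’ estimate of 316 per arm (their exact method was not restated here).

```python
from math import sqrt
from scipy.stats import chi2_contingency, norm

# rows: propofol, sevoflurane; cols: POD, no POD (back-calculated counts)
table = [[35, 71],   # 35/106 = 33.0%
         [24, 79]]   # 24/103 = 23.3%

# Pearson chi-square without continuity correction
chi2, p, dof, _ = chi2_contingency(table, correction=False)
print(f"chi-square = {chi2:.2f}, P = {p:.3f}")  # reproduces the reported P = .119

# Per-arm sample size to detect 33.0% vs 23.3% with two-sided
# alpha = .05 and 80% power (two-proportion normal approximation)
p1, p2 = 0.330, 0.233
pbar = (p1 + p2) / 2
z_a, z_b = norm.ppf(0.975), norm.ppf(0.80)
n = ((z_a * sqrt(2 * pbar * (1 - pbar))
      + z_b * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2) / (p1 - p2) ** 2
print(f"required n per arm ~ {n:.0f}")
```

With these assumptions the required sample size works out to roughly 330 to 340 per arm, underscoring why a trial of 209 participants could not resolve a 9.7-percentage-point difference.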
Applications for Clinical Practice and System Implementation
The delineation of risk factors that contribute to delirium after surgery in older patients is key to mitigating risk for POD and improving clinical outcomes. An important step toward a better understanding of these modifiable risk factors is to clearly quantify the intraoperative risk of POD attributable to specific anesthetics. While preclinical studies have shown differential neurotoxic effects of propofol and sevoflurane, their impact on clinically important neurologic outcomes such as delirium and cognitive decline remains poorly understood. Although Studies 1 and 2 both provided head-to-head comparisons of propofol and sevoflurane as risk factors for POD in high-operative-stress surgeries in older patients, their results were inconsistent. That said, such incremental gains in knowledge are expected in the course of discovery around a clinically complex research question. Importantly, these studies demonstrated methodological approaches that could be taken to further this line of research.
The factors mediating the differences in neurologic outcomes between anesthetic agents are likely pharmacological, biological, and methodological. Pharmacologically, differences in target receptors, such as GABA-A (propofol, etomidate) or NMDA (ketamine), could be a defining feature in the differing incidence of POD; secondary actions of anesthetic agents on glycine, nicotinic, and acetylcholine receptors could play a role as well. Biologically, variants in genes implicated in the metabolism of anesthetics, including CYP2E1, CYP2B6, CYP2C9, GSTP1, UGT1A9, SULT1A1, and NQO1, could result in different responses to anesthetics.2 Methodologically, routes of anesthetic administration (eg, inhalational vs intravenous), preexisting anatomical differences, or confounding medical conditions (eg, lower respiratory volume with older age) may influence POD incidence, duration, or severity. Moreover, methodological differences between Studies 1 and 2, such as the surgeries performed (spine vs TKR/THR), the patient populations (South Korean vs Chinese), and the diagnosis and monitoring of delirium (retrospective screening and diagnosis vs prospective CAM/CAM-S), may impact delirium outcomes. These factors should be considered in the design of future clinical trials investigating the effects of anesthetics on POD.
Given the high prevalence of delirium and its associated adverse outcomes in the immediate postoperative period in older patients, further research is warranted to determine how anesthetics affect POD, with the goal of optimizing perioperative care and mitigating risk in this vulnerable population. Moreover, parallel investigations into how anesthetics differentially influence transient or longer-term cognitive impairment after surgery (ie, postoperative cognitive dysfunction) in older adults are urgently needed to improve their cognitive health.
Practice Points
- Intravenous propofol and inhalational sevoflurane may be differentially associated with incidence, duration, and severity of POD in geriatric surgical patients.
- Further larger-scale studies are warranted to clarify the role of anesthetic choice in POD in order to optimize surgical outcomes in older patients.
–Jared Doan, BS, and Fred Ko, MD
Icahn School of Medicine at Mount Sinai
1. Dasgupta M, Dumbrell AC. Preoperative risk assessment for delirium after noncardiac surgery: a systematic review. J Am Geriatr Soc. 2006;54(10):1578-1589. doi:10.1111/j.1532-5415.2006.00893.x
2. Mikstacki A, Skrzypczak-Zielinska M, Tamowicz B, et al. The impact of genetic factors on response to anaesthetics. Adv Med Sci. 2013;58(1):9-14. doi:10.2478/v10039-012-0065-z
Study 1 Overview (Chang et al)
Objective: To assess the incidence of postoperative delirium (POD) following propofol- vs sevoflurane-based anesthesia in geriatric spine surgery patients.
Design: Retrospective, single-blinded observational study of propofol- and sevoflurane-based anesthesia cohorts.
Setting and participants: Patients eligible for this study were aged 65 years or older admitted to the SMG-SNU Boramae Medical Center (Seoul, South Korea). All patients underwent general anesthesia either via intravenous propofol or inhalational sevoflurane for spine surgery between January 2015 and December 2019. Patients were retrospectively identified via electronic medical records. Patient exclusion criteria included preoperative delirium, history of dementia, psychiatric disease, alcoholism, hepatic or renal dysfunction, postoperative mechanical ventilation dependence, other surgery within the recent 6 months, maintenance of intraoperative anesthesia with combined anesthetics, or incomplete medical record.
Main outcome measures: The primary outcome was the incidence of POD after administration of propofol- and sevoflurane-based anesthesia during hospitalization. Patients were screened for POD regularly by attending nurses using the Nursing Delirium Screening Scale (disorientation, inappropriate behavior, inappropriate communication, hallucination, and psychomotor retardation) during the entirety of the patient’s hospital stay; if 1 or more screening criteria were met, a psychiatrist was consulted for the proper diagnosis and management of delirium. A psychiatric diagnosis was required for a case to be counted toward the incidence of POD in this study. Secondary outcomes included postoperative 30-day complications (angina, myocardial infarction, transient ischemic attack/stroke, pneumonia, deep vein thrombosis, pulmonary embolism, acute kidney injury, or infection) and length of postoperative hospital stay.
Main results: POD occurred in 29 patients (10.3%) out of the total cohort of 281. POD was more common in the sevoflurane group than in the propofol group (15.7% vs 5.0%; P = .003). Using multivariable logistic regression, inhalational sevoflurane was associated with an increased risk of POD as compared to propofol-based anesthesia (odds ratio [OR], 4.120; 95% CI, 1.549-10.954; P = .005). There was no association between choice of anesthetic and postoperative 30-day complications or the length of postoperative hospital stay. Both older age (OR, 1.242; 95% CI, 1.130-1.366; P < .001) and higher pain score at postoperative day 1 (OR, 1.338; 95% CI, 1.056-1.696; P = .016) were associated with increased risk of POD.
Conclusion: Propofol-based anesthesia was associated with a lower incidence of and risk for POD than sevoflurane-based anesthesia in older patients undergoing spine surgery.
Study 2 Overview (Mei et al)
Objective: To determine the incidence and duration of POD in older patients after total knee/hip replacement (TKR/THR) under intravenous propofol or inhalational sevoflurane general anesthesia.
Design: Randomized clinical trial of propofol and sevoflurane groups.
Setting and participants: This study was conducted at the Shanghai Tenth People’s Hospital and involved 209 participants enrolled between June 2016 and November 2019. All participants were 60 years of age or older, scheduled for TKR/THR surgery under general anesthesia, American Society of Anesthesiologists (ASA) class I to III, and assessed to be of normal cognitive function preoperatively via a Mini-Mental State Examination. Participant exclusion criteria included preexisting delirium as assessed by the Confusion Assessment Method (CAM), prior diagnosed neurological diseases (eg, Parkinson’s disease), prior diagnosed mental disorders (eg, schizophrenia), or impaired vision or hearing that would influence cognitive assessments. All participants were randomly assigned to either sevoflurane or propofol anesthesia for their surgery via a computer-generated list. Of these, 103 received inhalational sevoflurane and 106 received intravenous propofol. All participants received standardized postoperative care.
Main outcome measures: All participants were interviewed by investigators, who were blinded to the anesthesia regimen, twice daily on postoperative days 1, 2, and 3 using CAM and a CAM-based scoring system (CAM-S) to assess delirium severity. The CAM encapsulated 4 criteria: acute onset and fluctuating course, agitation, disorganized thinking, and altered level of consciousness. To diagnose delirium, both the first and second criteria must be met, in addition to either the third or fourth criterion. The averages of the scores across the 3 postoperative days indicated delirium severity, while the incidence and duration of delirium was assessed by the presence of delirium as determined by CAM on any postoperative day.
Main results: All eligible participants (N = 209; mean [SD] age 71.2 [6.7] years; 29.2% male) were included in the final analysis. The incidence of POD was not statistically different between the propofol and sevoflurane groups (33.0% vs 23.3%; P = .119, Chi-square test). It was estimated that 316 participants in each arm of the study were needed to detect statistical differences. The number of days of POD per person were higher with propofol anesthesia as compared to sevoflurane (0.5 [0.8] vs 0.3 [0.5]; P = .049, Student’s t-test).
Conclusion: This underpowered study showed a 9.7% difference in the incidence of POD between older adults who received propofol (33.0%) and sevoflurane (23.3%) after THR/TKR. Further studies with a larger sample size are needed to compare general anesthetics and their role in POD.
Commentary
Delirium is characterized by an acute state of confusion with fluctuating mental status, inattention, disorganized thinking, and altered level of consciousness. It is often caused by medications and/or their related adverse effects, infections, electrolyte imbalances, and other clinical etiologies. Delirium often manifests in post-surgical settings, disproportionately affecting older patients and leading to increased risk of morbidity, mortality, hospital length of stay, and health care costs.1 Intraoperative risk factors for POD are determined by the degree of operative stress (eg, lower-risk surgeries put the patient at reduced risk for POD as compared to higher-risk surgeries) and are additive to preexisting patient-specific risk factors, such as older age and functional impairment.1 Because operative stress is associated with risk for POD, limiting operative stress in controlled ways, such as through the choice of anesthetic agent administered, may be a pragmatic way to manage operative risks and optimize outcomes, especially when serving a surgically vulnerable population.
In Study 1, Chang et al sought to assess whether 2 commonly utilized general anesthetics, propofol and sevoflurane, in older patients undergoing spine surgery differentially affected the incidence of POD. In this retrospective, single-blinded observational study of 281 geriatric patients, the researchers found that sevoflurane was associated with a higher risk of POD as compared to propofol. However, these anesthetics were not associated with surgical outcomes such as postoperative 30-day complications or the length of postoperative hospital stay. While these findings added new knowledge to this field of research, several limitations should be kept in mind when interpreting this study’s results. For instance, the sample size was relatively small, with all cases selected from a single center utilizing a retrospective analysis. In addition, although a standardized nursing screening tool was used as a method for delirium detection, hypoactive delirium or less symptomatic delirium may have been missed, which in turn would lead to an underestimation of POD incidence. The latter is a common limitation in delirium research.
In Study 2, Mei et al similarly explored the effects of general anesthetics on POD in older surgical patients. Specifically, using a randomized clinical trial design, the investigators compared propofol with sevoflurane in older patients who underwent TKR/THR, and their roles in POD severity and duration. Although the incidence of POD was higher in those who received propofol compared to sevoflurane, this trial was underpowered and the results did not reach statistical significance. In addition, while the duration of POD was slightly longer in the propofol group compared to the sevoflurane group (0.5 vs 0.3 days), it was unclear if this finding was clinically significant. Similar to many research studies in POD, limitations of Study 2 included a small sample size of 209 patients, with all participants enrolled from a single center. On the other hand, this study illustrated the feasibility of a method that allowed reproducible prospective assessment of POD time course using CAM and CAM-S.
Applications for Clinical Practice and System Implementation
The delineation of risk factors that contribute to delirium after surgery in older patients is key to mitigating risks for POD and improving clinical outcomes. An important step towards a better understanding of these modifiable risk factors is to clearly quantify intraoperative risk of POD attributable to specific anesthetics. While preclinical studies have shown differential neurotoxicity effects of propofol and sevoflurane, their impact on clinically important neurologic outcomes such as delirium and cognitive decline remains poorly understood. Although Studies 1 and 2 both provided head-to-head comparisons of propofol and sevoflurane as risk factors for POD in high-operative-stress surgeries in older patients, the results were inconsistent. That being said, this small incremental increase in knowledge was not unexpected in the course of discovery around a clinically complex research question. Importantly, these studies provided evidence regarding the methodological approaches that could be taken to further this line of research.
The mediating factors of the differences on neurologic outcomes between anesthetic agents are likely pharmacological, biological, and methodological. Pharmacologically, the differences between target receptors, such as GABAA (propofol, etomidate) or NMDA (ketamine), could be a defining feature in the difference in incidence of POD. Additionally, secondary actions of anesthetic agents on glycine, nicotinic, and acetylcholine receptors could play a role as well. Biologically, genes such as CYP2E1, CYP2B6, CYP2C9, GSTP1, UGT1A9, SULT1A1, and NQO1 have all been identified as genetic factors in the metabolism of anesthetics, and variations in such genes could result in different responses to anesthetics.2 Methodologically, routes of anesthetic administration (eg, inhalation vs intravenous), preexisting anatomical structures, or confounding medical conditions (eg, lower respiratory volume due to older age) may influence POD incidence, duration, or severity. Moreover, methodological differences between Studies 1 and 2, such as surgeries performed (spinal vs TKR/THR), patient populations (South Korean vs Chinese), and the diagnosis and monitoring of delirium (retrospective screening and diagnosis vs prospective CAM/CAM-S) may impact delirium outcomes. Thus, these factors should be considered in the design of future clinical trials undertaken to investigate the effects of anesthetics on POD.
Given the high prevalence of delirium and its associated adverse outcomes in the immediate postoperative period in older patients, further research is warranted to determine how anesthetics affect POD in order to optimize perioperative care and mitigate risks in this vulnerable population. Moreover, parallel investigations into how anesthetics differentially impact the development of transient or longer-term cognitive impairment after a surgical procedure (ie, postoperative cognitive dysfunction) in older adults are urgently needed in order to improve their cognitive health.
Practice Points
- Intravenous propofol and inhalational sevoflurane may be differentially associated with incidence, duration, and severity of POD in geriatric surgical patients.
- Further larger-scale studies are warranted to clarify the role of anesthetic choice in POD in order to optimize surgical outcomes in older patients.
–Jared Doan, BS, and Fred Ko, MD
Icahn School of Medicine at Mount Sinai
Study 1 Overview (Chang et al)
Objective: To assess the incidence of postoperative delirium (POD) following propofol- vs sevoflurane-based anesthesia in geriatric spine surgery patients.
Design: Retrospective, single-blinded observational study of propofol- and sevoflurane-based anesthesia cohorts.
Setting and participants: Patients eligible for this study were aged 65 years or older admitted to the SMG-SNU Boramae Medical Center (Seoul, South Korea). All patients underwent general anesthesia either via intravenous propofol or inhalational sevoflurane for spine surgery between January 2015 and December 2019. Patients were retrospectively identified via electronic medical records. Patient exclusion criteria included preoperative delirium, history of dementia, psychiatric disease, alcoholism, hepatic or renal dysfunction, postoperative mechanical ventilation dependence, other surgery within the recent 6 months, maintenance of intraoperative anesthesia with combined anesthetics, or incomplete medical record.
Main outcome measures: The primary outcome was the incidence of POD after administration of propofol- and sevoflurane-based anesthesia during hospitalization. Patients were screened for POD regularly by attending nurses using the Nursing Delirium Screening Scale (disorientation, inappropriate behavior, inappropriate communication, hallucination, and psychomotor retardation) during the entirety of the patient’s hospital stay; if 1 or more screening criteria were met, a psychiatrist was consulted for the proper diagnosis and management of delirium. A psychiatric diagnosis was required for a case to be counted toward the incidence of POD in this study. Secondary outcomes included postoperative 30-day complications (angina, myocardial infarction, transient ischemic attack/stroke, pneumonia, deep vein thrombosis, pulmonary embolism, acute kidney injury, or infection) and length of postoperative hospital stay.
Main results: POD occurred in 29 patients (10.3%) out of the total cohort of 281. POD was more common in the sevoflurane group than in the propofol group (15.7% vs 5.0%; P = .003). Using multivariable logistic regression, inhalational sevoflurane was associated with an increased risk of POD as compared to propofol-based anesthesia (odds ratio [OR], 4.120; 95% CI, 1.549-10.954; P = .005). There was no association between choice of anesthetic and postoperative 30-day complications or the length of postoperative hospital stay. Both older age (OR, 1.242; 95% CI, 1.130-1.366; P < .001) and higher pain score at postoperative day 1 (OR, 1.338; 95% CI, 1.056-1.696; P = .016) were associated with increased risk of POD.
Conclusion: Propofol-based anesthesia was associated with a lower incidence of and risk for POD than sevoflurane-based anesthesia in older patients undergoing spine surgery.
Study 2 Overview (Mei et al)
Objective: To determine the incidence and duration of POD in older patients after total knee/hip replacement (TKR/THR) under intravenous propofol or inhalational sevoflurane general anesthesia.
Design: Randomized clinical trial of propofol and sevoflurane groups.
Setting and participants: This study was conducted at the Shanghai Tenth People’s Hospital and involved 209 participants enrolled between June 2016 and November 2019. All participants were 60 years of age or older, scheduled for TKR/THR surgery under general anesthesia, American Society of Anesthesiologists (ASA) class I to III, and assessed to be of normal cognitive function preoperatively via a Mini-Mental State Examination. Participant exclusion criteria included preexisting delirium as assessed by the Confusion Assessment Method (CAM), prior diagnosed neurological diseases (eg, Parkinson’s disease), prior diagnosed mental disorders (eg, schizophrenia), or impaired vision or hearing that would influence cognitive assessments. All participants were randomly assigned to either sevoflurane or propofol anesthesia for their surgery via a computer-generated list. Of these, 103 received inhalational sevoflurane and 106 received intravenous propofol. All participants received standardized postoperative care.
Main outcome measures: All participants were interviewed by investigators, who were blinded to the anesthesia regimen, twice daily on postoperative days 1, 2, and 3 using CAM and a CAM-based scoring system (CAM-S) to assess delirium severity. The CAM encapsulated 4 criteria: acute onset and fluctuating course, agitation, disorganized thinking, and altered level of consciousness. To diagnose delirium, both the first and second criteria must be met, in addition to either the third or fourth criterion. The averages of the scores across the 3 postoperative days indicated delirium severity, while the incidence and duration of delirium was assessed by the presence of delirium as determined by CAM on any postoperative day.
Main results: All eligible participants (N = 209; mean [SD] age 71.2 [6.7] years; 29.2% male) were included in the final analysis. The incidence of POD was not statistically different between the propofol and sevoflurane groups (33.0% vs 23.3%; P = .119, Chi-square test). It was estimated that 316 participants in each arm of the study were needed to detect statistical differences. The number of days of POD per person were higher with propofol anesthesia as compared to sevoflurane (0.5 [0.8] vs 0.3 [0.5]; P = .049, Student’s t-test).
Conclusion: This underpowered study showed a 9.7% difference in the incidence of POD between older adults who received propofol (33.0%) and sevoflurane (23.3%) after THR/TKR. Further studies with a larger sample size are needed to compare general anesthetics and their role in POD.
Commentary
Delirium is characterized by an acute state of confusion with fluctuating mental status, inattention, disorganized thinking, and altered level of consciousness. It is often caused by medications and/or their related adverse effects, infections, electrolyte imbalances, and other clinical etiologies. Delirium often manifests in post-surgical settings, disproportionately affecting older patients and leading to increased risk of morbidity, mortality, hospital length of stay, and health care costs.1 Intraoperative risk factors for POD are determined by the degree of operative stress (eg, lower-risk surgeries put the patient at reduced risk for POD as compared to higher-risk surgeries) and are additive to preexisting patient-specific risk factors, such as older age and functional impairment.1 Because operative stress is associated with risk for POD, limiting operative stress in controlled ways, such as through the choice of anesthetic agent administered, may be a pragmatic way to manage operative risks and optimize outcomes, especially when serving a surgically vulnerable population.
In Study 1, Chang et al sought to assess whether 2 commonly utilized general anesthetics, propofol and sevoflurane, in older patients undergoing spine surgery differentially affected the incidence of POD. In this retrospective, single-blinded observational study of 281 geriatric patients, the researchers found that sevoflurane was associated with a higher risk of POD as compared to propofol. However, these anesthetics were not associated with surgical outcomes such as postoperative 30-day complications or the length of postoperative hospital stay. While these findings added new knowledge to this field of research, several limitations should be kept in mind when interpreting this study’s results. For instance, the sample size was relatively small, with all cases selected from a single center utilizing a retrospective analysis. In addition, although a standardized nursing screening tool was used as a method for delirium detection, hypoactive delirium or less symptomatic delirium may have been missed, which in turn would lead to an underestimation of POD incidence. The latter is a common limitation in delirium research.
In Study 2, Mei et al similarly explored the effects of general anesthetics on POD in older surgical patients. Specifically, using a randomized clinical trial design, the investigators compared propofol with sevoflurane in older patients who underwent TKR/THR, and their roles in POD severity and duration. Although the incidence of POD was higher in those who received propofol compared to sevoflurane, this trial was underpowered and the results did not reach statistical significance. In addition, while the duration of POD was slightly longer in the propofol group compared to the sevoflurane group (0.5 vs 0.3 days), it was unclear if this finding was clinically significant. Similar to many research studies in POD, limitations of Study 2 included a small sample size of 209 patients, with all participants enrolled from a single center. On the other hand, this study illustrated the feasibility of a method that allowed reproducible prospective assessment of POD time course using CAM and CAM-S.
Applications for Clinical Practice and System Implementation
The delineation of risk factors that contribute to delirium after surgery in older patients is key to mitigating risks for POD and improving clinical outcomes. An important step towards a better understanding of these modifiable risk factors is to clearly quantify intraoperative risk of POD attributable to specific anesthetics. While preclinical studies have shown differential neurotoxicity effects of propofol and sevoflurane, their impact on clinically important neurologic outcomes such as delirium and cognitive decline remains poorly understood. Although Studies 1 and 2 both provided head-to-head comparisons of propofol and sevoflurane as risk factors for POD in high-operative-stress surgeries in older patients, the results were inconsistent. That being said, this small incremental increase in knowledge was not unexpected in the course of discovery around a clinically complex research question. Importantly, these studies provided evidence regarding the methodological approaches that could be taken to further this line of research.
The factors mediating the differences in neurologic outcomes between anesthetic agents are likely pharmacological, biological, and methodological. Pharmacologically, differences between target receptors, such as GABAA (propofol, etomidate) or NMDA (ketamine), could be a defining feature underlying differences in the incidence of POD. Additionally, secondary actions of anesthetic agents on glycine, nicotinic, and acetylcholine receptors could play a role as well. Biologically, genes such as CYP2E1, CYP2B6, CYP2C9, GSTP1, UGT1A9, SULT1A1, and NQO1 have all been identified as genetic factors in the metabolism of anesthetics, and variation in these genes could result in different responses to anesthetics.2 Methodologically, routes of anesthetic administration (eg, inhalation vs intravenous), preexisting anatomical structures, or confounding medical conditions (eg, lower respiratory volume due to older age) may influence POD incidence, duration, or severity. Moreover, methodological differences between Studies 1 and 2, such as the surgeries performed (spinal vs TKR/THR), patient populations (South Korean vs Chinese), and the diagnosis and monitoring of delirium (retrospective screening and diagnosis vs prospective CAM/CAM-S), may affect delirium outcomes. These factors should therefore be considered in the design of future clinical trials investigating the effects of anesthetics on POD.
Given the high prevalence of delirium and its associated adverse outcomes in the immediate postoperative period in older patients, further research is warranted to determine how anesthetics affect POD in order to optimize perioperative care and mitigate risks in this vulnerable population. Moreover, parallel investigations into how anesthetics differentially impact the development of transient or longer-term cognitive impairment after a surgical procedure (ie, postoperative cognitive dysfunction) in older adults are urgently needed in order to improve their cognitive health.
Practice Points
- Intravenous propofol and inhalational sevoflurane may be differentially associated with incidence, duration, and severity of POD in geriatric surgical patients.
- Further larger-scale studies are warranted to clarify the role of anesthetic choice in POD in order to optimize surgical outcomes in older patients.
–Jared Doan, BS, and Fred Ko, MD
Icahn School of Medicine at Mount Sinai
1. Dasgupta M, Dumbrell AC. Preoperative risk assessment for delirium after noncardiac surgery: a systematic review. J Am Geriatr Soc. 2006;54(10):1578-1589. doi:10.1111/j.1532-5415.2006.00893.x
2. Mikstacki A, Skrzypczak-Zielinska M, Tamowicz B, et al. The impact of genetic factors on response to anaesthetics. Adv Med Sci. 2013;58(1):9-14. doi:10.2478/v10039-012-0065-z
Hiccups in patients with cancer often overlooked, undertreated
But even if recognized, hiccups may not be treated effectively, according to a national survey of cancer care clinicians.
When poorly controlled, persistent hiccups can affect a patient’s quality of life, with 40% of survey respondents considering chronic hiccups “much more” or “somewhat more” severe than nausea and vomiting.
Overall, the findings indicate that patients with cancer who develop persistent hiccups are “truly suffering,” the authors wrote.
The survey results were published online recently in the American Journal of Hospice and Palliative Medicine.
Hiccups may simply be a nuisance for most, but these spasms can become problematic for patients with cancer, leading to sleep deprivation, fatigue, aspiration pneumonia, compromised food intake, weight loss, pain, and even death.
Hiccups can develop when the nerve that controls the diaphragm becomes irritated, which can be triggered by certain chemotherapy drugs.
Yet few studies have focused on hiccups in patients with cancer and none, until now, has sought the perspectives of cancer care clinicians.
Aminah Jatoi, MD, medical oncologist with the Mayo Clinic in Rochester, Minn., and two Mayo colleagues developed a survey, alongside MeterHealth, which this news organization distributed to clinicians with an interest in cancer care.
The survey gauged clinicians’ awareness or lack of awareness about clinically significant hiccups as well as treatments for hiccups and whether they consider hiccups an unmet palliative need.
A total of 684 clinicians completed two eligibility screening questions, which required them to have cared, in the past 6 months, for more than 10 patients with cancer who had clinically significant hiccups (defined as hiccups that lasted more than 48 hours or occurred as a result of cancer or cancer care).
Among 113 eligible health care professionals, 90 completed the survey: 42 physicians, 29 nurses, 15 nurse practitioners, and 4 physician assistants.
The survey revealed three key issues.
The first is that hiccups appear to be an underrecognized issue.
Among health care professionals who answered the eligibility screening questions, fewer than 20% reported caring for more than 10 patients with cancer in the past 6 months who had persistent hiccups. Most of these clinicians reported caring for more than 1,000 patients per year.
Given that 15%-40% of patients with cancer report hiccups, this finding suggests that hiccups are not widely recognized by health care professionals.
Second, the survey data showed that hiccups often increase patients’ anxiety, fatigue, and sleep problems and can decrease productivity at work or school.
In fact, when comparing hiccups to nausea and vomiting – sometimes described as one of the most severe side effects of cancer care – 40% of respondents rated hiccups as “much more” or “somewhat more” severe than nausea and vomiting for their patients and 38% rated the severity of the two issues as “about the same.”
Finally, even when hiccups are recognized and treated, about 20% of respondents said that current therapies are not very effective, and more treatment options are needed.
Among the survey respondents, the most frequently prescribed medications for chronic hiccups were the antipsychotic chlorpromazine, the muscle relaxant baclofen (Lioresal), the antiemetic metoclopramide (Metozolv ODT, Reglan), and the anticonvulsants gabapentin (Neurontin) and carbamazepine (Tegretol).
Survey respondents who provided comments about current treatments for hiccups highlighted a range of challenges. One respondent said, “When current therapies do not work, it can be very demoralizing to our patients.” Another said, “I feel like it is a gamble whether treatment for hiccups will work or not.”
Still another felt that while current treatments work “quite well to halt hiccups,” they come with side effects that can be “quite severe.”
These results “clearly point to the unmet needs of hiccups in patients with cancer and should prompt more research aimed at generating more palliative options,” the authors said.
This research had no commercial funding. MeterHealth reviewed the manuscript and provided input on the accuracy of methods and results. Dr. Jatoi reports serving on an advisory board for MeterHealth (honoraria to institution).
A version of this article first appeared on Medscape.com.
FROM THE AMERICAN JOURNAL OF HOSPICE AND PALLIATIVE MEDICINE
Which anticoagulant is safest for frail elderly patients with nonvalvular A-fib?
ILLUSTRATIVE CASE
A frail 76-year-old woman with a history of hypertension and hyperlipidemia presents for evaluation of palpitations. An in-office electrocardiogram reveals that the patient is in AF. Her CHA2DS2-VASc score is 4 and her HAS-BLED score is 2.2,3 Using shared decision making, you decide to start medications for her AF. You plan to initiate a beta-blocker for rate control and must now decide on anticoagulation. Which oral anticoagulant would you prescribe for this patient’s AF, given her frail status?
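The stated CHA2DS2-VASc score of 4 can be verified against the standard published weights (CHF 1 point; hypertension 1; age ≥ 75 years 2; diabetes 1; prior stroke/TIA 2; vascular disease 1; age 65-74 years 1; female sex 1). The following sketch is purely illustrative; the function name and parameters are hypothetical, not from the study.

```python
def cha2ds2_vasc(age, female, chf=False, htn=False, diabetes=False,
                 stroke_tia=False, vascular=False):
    """Tally the CHA2DS2-VASc stroke risk score from its standard components."""
    score = 0
    score += 1 if chf else 0          # C: congestive heart failure
    score += 1 if htn else 0          # H: hypertension
    # A2: age >= 75 scores 2; A: age 65-74 scores 1
    score += 2 if age >= 75 else (1 if 65 <= age <= 74 else 0)
    score += 1 if diabetes else 0     # D: diabetes mellitus
    score += 2 if stroke_tia else 0   # S2: prior stroke/TIA
    score += 1 if vascular else 0     # V: vascular disease
    score += 1 if female else 0       # Sc: sex category (female)
    return score

# The illustrative case: 76-year-old hypertensive woman
print(cha2ds2_vasc(age=76, female=True, htn=True))  # 4
```

Age ≥ 75 (2 points), hypertension (1), and female sex (1) account for the case patient’s score of 4.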
Frailty is defined as a state of vulnerability with a decreased ability to recover from an acute stressful event.4 The prevalence of frailty varies by the measurements used and the population studied. A 2021 meta-analysis found that frailty prevalence ranges from 12% to 24% worldwide in patients older than 50 years5 and may increase to > 30% among those ages 85 years and older.6 Frailty increases rates of AEs such as falls7 and fracture,8 leading to disability,9 decreased quality of life,10 increased utilization of health care,11 and increased mortality.12 A number of validated approaches are available to screen for and measure frailty.13-18
Given the association with negative health outcomes and high health care utilization, frailty is an important clinical factor for physicians to consider when treating elderly patients. Frailty assessment may allow for more tailored treatment choices for patients, with a potential reduction in complications. Although CHA2DS2-VASc and HAS-BLED scores assist in the decision-making process of whether to start anticoagulation, these tools do not take frailty into consideration or guide anticoagulant choice.2,3 The purpose of this study was to analyze how levels of frailty affect the association of 3 different direct oral anticoagulants (DOACs) vs warfarin with various AEs (death, stroke, or major bleeding).
STUDY SUMMARY
This DOAC rose above the others
This retrospective cohort study compared the safety of 3 DOACs—dabigatran, rivaroxaban, and apixaban—vs warfarin in Medicare beneficiaries with AF, using 1:1 propensity score (PS)–matched analysis. Eligible patients were ages 65 years or older, with a filled prescription for a DOAC or warfarin, no prior oral anticoagulant exposure in the previous 183 days, a diagnostic code of AF, and continuous enrollment in Medicare Parts A, B, and D only. Patients were excluded if they had missing demographic data, received hospice care, resided in a nursing facility at drug initiation, had another indication for anticoagulation, or had a contraindication to either a DOAC or warfarin.
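For readers unfamiliar with 1:1 propensity score matching, a minimal greedy nearest-neighbor sketch conveys the idea: each treated (DOAC) patient is paired with the unused control (warfarin) patient whose estimated propensity score is closest, within a caliper. This is a generic illustration under assumed conventions; the study’s actual matching specification is not given here.

```python
def ps_match(ps_treated, ps_control, caliper=0.05):
    """Greedy 1:1 nearest-neighbor match on propensity score within a caliper.

    Returns (treated_index, control_index) pairs; treated patients with no
    control within the caliper are left unmatched.
    """
    used = set()
    pairs = []
    for i, p in enumerate(ps_treated):
        best, best_d = None, caliper
        for j, q in enumerate(ps_control):
            if j in used:
                continue
            d = abs(p - q)
            if d <= best_d:  # nearest unused control within the caliper
                best, best_d = j, d
        if best is not None:
            used.add(best)
            pairs.append((i, best))
    return pairs

print(ps_match([0.50], [0.52, 0.30]))  # [(0, 0)]
```

In practice the propensity scores themselves would come from a logistic regression of treatment assignment on baseline covariates; matching then balances those covariates between the DOAC and warfarin groups.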
Frailty was measured using a claims-based frailty index (CFI), which applies health care utilization data to estimate a frailty index, with cut points for nonfrailty, prefrailty, and frailty. The CFI score has 93 claims-based variables, including wheelchairs and durable medical equipment, open wounds, diseases such as chronic obstructive pulmonary disease and ischemic heart disease, and transportation services.15-17 In this study, nonfrailty was defined as a CFI < 0.15, prefrailty as a CFI of 0.15 to 0.24, and frailty as a CFI ≥ 0.25.
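The study’s CFI cut points amount to a simple threshold rule, sketched below (the function name is illustrative; the thresholds are those stated above).

```python
def classify_frailty(cfi: float) -> str:
    """Classify frailty status from a claims-based frailty index (CFI)
    using the study's cut points: < 0.15 nonfrail, 0.15-0.24 prefrail,
    >= 0.25 frail."""
    if cfi < 0.15:
        return "nonfrail"
    elif cfi < 0.25:
        return "prefrail"
    return "frail"

# The cohort mean CFI of 0.19-0.20 falls in the prefrail range
print(classify_frailty(0.19))  # prefrail
```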
The primary outcome—a composite endpoint of death, ischemic stroke, or major bleeding—was measured for each of the DOAC–warfarin cohorts in the overall population and stratified by frailty classification. Patients were followed until the occurrence of a study outcome, Medicare disenrollment, the end of the study period, discontinuation of the index drug (defined as > 5 days), change to a different anticoagulant, admission to a nursing facility, enrollment in hospice, initiation of dialysis, or kidney transplant. The authors conducted a PS-matched analysis to reduce any imbalances in clinical characteristics between the DOAC- and warfarin-treated groups, as well as a sensitivity analysis to assess the strength of the data findings using different assumptions.
The authors created 3 DOAC–warfarin cohorts: dabigatran (n = 81,863) vs warfarin (n = 256,722), rivaroxaban (n = 185,011) vs warfarin (n = 228,028), and apixaban (n = 222,478) vs warfarin (n = 206,031). After PS matching, the mean age in all cohorts was 76 to 77 years, about 50% were female, and 91% were White. The mean HAS-BLED score was 2 and the mean CHA2DS2-VASc score was 4. The mean CFI was 0.19 to 0.20, defined as prefrail. Patients classified as frail were older, more likely to be female, and more likely to have greater comorbidities, higher scores on CHA2DS2-VASc and HAS-BLED, and higher health care utilization.
In the dabigatran–warfarin cohort (median follow-up, 72 days), the event rate of the composite endpoint per 1000 person-years (PY) was 63.5 for dabigatran and 65.6 for warfarin (hazard ratio [HR] = 0.98; 95% CI, 0.92 to 1.05; rate difference [RD] per 1000 PY = –2.2; 95% CI, –6.5 to 2.1). Dabigatran was associated with a lower rate of the composite endpoint than warfarin in the nonfrail subgroup but not in the prefrail or frail subgroups.
In the rivaroxaban–warfarin cohort (median follow-up, 82 days), the composite endpoint rate per 1000 PY was 77.8 for rivaroxaban and 83.7 for warfarin (HR = 0.98; 95% CI, 0.94 to 1.02; RD per 1000 PY = –5.9; 95% CI, –9.4 to –2.4). When stratifying by frailty category, both dabigatran and rivaroxaban were associated with a lower composite endpoint rate than warfarin for the nonfrail population only (HR = 0.81; 95% CI, 0.68 to 0.97, and HR = 0.88; 95% CI, 0.77 to 0.99, respectively).
In the apixaban–warfarin cohort (median follow-up, 84 days), the rate of the composite endpoint per 1000 PY was 60.1 for apixaban and 92.3 for warfarin (HR = 0.68; 95% CI, 0.65 to 0.72; RD per 1000 PY = –32.2; 95% CI, –36.1 to –28.3). The beneficial association for apixaban was present in all frailty categories, with an HR of 0.61 (95% CI, 0.52 to 0.71) for nonfrail patients, 0.66 (95% CI, 0.61 to 0.70) for prefrail patients, and 0.73 (95% CI, 0.67 to 0.80) for frail patients. Apixaban was the only DOAC with a relative reduction in the hazard of death, ischemic stroke, or major bleeding among all frailty groups.
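As a consistency check, the reported rate differences closely track the simple difference of the crude event rates per 1000 PY (the dabigatran figure of –2.2 differs slightly from the crude –2.1, presumably from rounding or the estimation method). A minimal sketch using the rates quoted above:

```python
# Crude event rates per 1000 person-years (DOAC, warfarin) as reported
# for each PS-matched cohort.
rates = {
    "dabigatran":  (63.5, 65.6),
    "rivaroxaban": (77.8, 83.7),
    "apixaban":    (60.1, 92.3),
}
for drug, (doac_rate, warfarin_rate) in rates.items():
    rd = round(doac_rate - warfarin_rate, 1)  # rate difference per 1000 PY
    print(f"{drug}: RD per 1000 PY = {rd}")
```

The apixaban difference (–32.2 per 1000 PY) is roughly an order of magnitude larger than that of the other two DOACs, mirroring its uniquely consistent benefit across frailty strata.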
WHAT’S NEW
Only apixaban had lower AE rates vs warfarin across frailty levels
Three DOACs (dabigatran, rivaroxaban, and apixaban) reduced the risk of death, ischemic stroke, or major bleeding compared with warfarin in older adults with AF, but only apixaban was associated with a relative reduction of these adverse outcomes in patients of all frailty classifications.
CAVEATS
Important data but RCTs are needed
The statistical power of this study is considerable given its large, matched cohorts. However, it remains a retrospective observational study, subject to residual confounding. The authors attempted to account for these limitations and potential confounders by performing a PS-matched analysis and a sensitivity analysis; nonetheless, these findings should be confirmed in randomized controlled trials.
Additionally, median follow-up in each of the DOAC–warfarin cohorts was < 90 days. Trials addressing long-term outcomes are warranted.
Finally, there was no control group in comparison with anticoagulation. It is possible that choosing not to use an anticoagulant is the best choice for frail elderly patients.
CHALLENGES TO IMPLEMENTATION
Doctors need a practical frailty scale, patients need an affordable Rx
Frailty is not often considered a measurable trait. The approach used in the study to determine the CFI is not a practical clinical tool. Studies comparing a frailty calculation software application or an easily implementable survey may help bring this clinically impactful information to the hands of primary care physicians. The Clinical Frailty Scale—a brief, 7-point scale based on the physician’s clinical impression of the patient—has been found to correlate with other established frailty measures18 and might be an option for busy clinicians. However, the current study did not utilize this measurement, and the validity of its use by primary care physicians in the outpatient setting requires further study.
In addition, cost may be a barrier for patients younger than 65 years or for those older than 65 years who do not qualify for Medicare or do not have Medicare Part D. The average monthly cost of the DOACs ranges from $560 for dabigatran19 to $600 for rivaroxaban20 and $623 for apixaban.21 As always, the choice of anticoagulant therapy is a clinical judgment and a joint decision of the patient and physician.
1. Kim DH, Pawar A, Gagne JJ, et al. Frailty and clinical outcomes of direct oral anticoagulants versus warfarin in older adults with atrial fibrillation: a cohort study. Ann Intern Med. 2021;174:1214-1223. doi: 10.7326/M20-7141
2. Zhu W, He W, Guo L, et al. The HAS-BLED score for predicting major bleeding risk in anticoagulated patients with atrial fibrillation: a systematic review and meta-analysis. Clin Cardiol. 2015;38:555-561. doi: 10.1002/clc.22435
3. Olesen JB, Lip GYH, Hansen ML, et al. Validation of risk stratification schemes for predicting stroke and thromboembolism in patients with atrial fibrillation: nationwide cohort study. BMJ. 2011;342:d124. doi: 10.1136/bmj.d124
4. Xue QL. The frailty syndrome: definition and natural history. Clin Geriatr Med. 2011;27:1-15. doi: 10.1016/j.cger.2010.08.009
5. O’Caoimh R, Sezgin D, O’Donovan MR, et al. Prevalence of frailty in 62 countries across the world: a systematic review and meta-analysis of population-level studies. Age Ageing. 2021;50:96-104. doi: 10.1093/ageing/afaa219
6. Campitelli MA, Bronskill SE, Hogan DB, et al. The prevalence and health consequences of frailty in a population-based older home care cohort: a comparison of different measures. BMC Geriatr. 2016;16:133. doi: 10.1186/s12877-016-0309-z
7. Kojima G. Frailty as a predictor of future falls among community-dwelling older people: a systematic review and meta-analysis. J Am Med Dir Assoc. 2015;16:1027-1033. doi: 10.1016/j.jamda. 2015.06.018
8. Kojima G. Frailty as a predictor of fractures among community-dwelling older people: a systematic review and meta-analysis. Bone. 2016;90:116-122. doi: 10.1016/j.bone.2016.06.009
9. Kojima G. Quick and simple FRAIL scale predicts incident activities of daily living (ADL) and instrumental ADL (IADL) disabilities: a systematic review and meta-analysis. J Am Med Dir Assoc. 2018;19:1063-1068. doi: 10.1016/j.jamda.2018.07.019
10. Kojima G, Liljas AEM, Iliffe S. Frailty syndrome: implications and challenges for health care policy. Risk Manag Healthc Policy. 2019;12:23-30. doi: 10.2147/RMHP.S168750
11. Roe L, Normand C, Wren MA, et al. The impact of frailty on healthcare utilisation in Ireland: evidence from The Irish Longitudinal Study on Ageing. BMC Geriatr. 2017;17:203. doi: 10.1186/s12877-017-0579-0
12. Hao Q, Zhou L, Dong B, et al. The role of frailty in predicting mortality and readmission in older adults in acute care wards: a prospective study. Sci Rep. 2019;9:1207. doi: 10.1038/s41598-018-38072-7
13. Fried LP, Tangen CM, Walston J, et al; Cardiovascular Health Study Collaborative Research Group. Frailty in older adults: evidence for a phenotype. J Gerontol A Biol Sci Med Sci. 2001;56:M146-M156. doi: 10.1093/gerona/56.3.m146
14. Ryan J, Espinoza S, Ernst ME, et al. Validation of a deficit-accumulation frailty Index in the ASPirin in Reducing Events in the Elderly study and its predictive capacity for disability-free survival. J Gerontol A Biol Sci Med Sci. 2022;77:19-26. doi: 10.1093/gerona/glab225
15. Kim DH, Glynn RJ, Avorn J, et al. Validation of a claims-based frailty index against physical performance and adverse health outcomes in the Health and Retirement Study. J Gerontol A Biol Sci Med Sci. 2019;74:1271-1276. doi: 10.1093/gerona/gly197
16. Kim DH, Schneeweiss S, Glynn RJ, et al. Measuring frailty in Medicare data: development and validation of a claims-based frailty index. J Gerontol A Biol Sci Med Sci. 2018;73:980-987. doi: 10.1093/gerona/glx229
17. Claims-based frailty index. Harvard Dataverse website. 2022. Accessed April 5, 2022. https://dataverse.harvard.edu/dataverse/cfi
18. Rockwood K, Song X, MacKnight C, et al. A global clinical measure of fitness and frailty in elderly people. CMAJ. 2005;173:489-95. doi: 10.1503/cmaj.050051
19. Dabigatran. GoodRx. Accessed September 26, 2022. www.goodrx.com/dabigatran
20. Rivaroxaban. GoodRx. Accessed September 26, 2022. www.goodrx.com/rivaroxaban
21. Apixaban (Eliquis). GoodRx. Accessed September 26, 2022. www.goodrx.com/eliquis
ILLUSTRATIVE CASE
A frail 76-year-old woman with a history of hypertension and hyperlipidemia presents for evaluation of palpitations. An in-office electrocardiogram reveals that the patient is in AF. Her CHA2DS2-VASc score is 4 and her HAS-BLED score is 2.2,3 Using shared decision making, you decide to start medications for her AF. You plan to initiate a beta-blocker for rate control and must now decide on anticoagulation. Which oral anticoagulant would you prescribe for this patient’s AF, given her frail status?
Frailty is defined as a state of vulnerability with a decreased ability to recover from an acute stressful event.4 The prevalence of frailty varies by the measurements used and the population studied. A 2021 meta-analysis found that frailty prevalence ranges from 12% to 24% worldwide in patients older than 50 years5 and may increase to > 30% among those ages 85 years and older.6 Frailty increases rates of AEs such as falls7 and fracture,8 leading to disability,9 decreased quality of life,10 increased utilization of health care,11 and increased mortality.12 A number of validated approaches are available to screen for and measure frailty.13-18
Given the association with negative health outcomes and high health care utilization, frailty is an important clinical factor for physicians to consider when treating elderly patients. Frailty assessment may allow for more tailored treatment choices for patients, with a potential reduction in complications. Although CHA2DS2-VASc and HAS-BLED scores assist in the decision-making process of whether to start anticoagulation, these tools do not take frailty into consideration or guide anticoagulant choice.2,3 The purpose of this study was to analyze how levels of frailty affect the association of 3 different direct oral anticoagulants (DOACs) vs warfarin with various AEs (death, stroke, or major bleeding).
STUDY SUMMARY
This DOAC rose above the others
This retrospective cohort study compared the safety of 3 DOACs—dabigatran, rivaroxaban, and apixaban—vs warfarin in Medicare beneficiaries with AF, using 1:1 propensity score (PS)–matched analysis. Eligible patients were ages 65 years or older, with a filled prescription for a DOAC or warfarin, no prior oral anticoagulant exposure in the previous 183 days, a diagnostic code of AF, and continuous enrollment in Medicare Parts A, B, and D only. Patients were excluded if they had missing demographic data, received hospice care, resided in a nursing facility at drug initiation, had another indication for anticoagulation, or had a contraindication to either a DOAC or warfarin.
Frailty was measured using a claims-based frailty index (CFI), which applies health care utilization data to estimate a frailty index, with cut points for nonfrailty, prefrailty, and frailty. The CFI score has 93 claims-based variables, including wheelchairs and durable medical equipment, open wounds, diseases such as chronic obstructive pulmonary disease and ischemic heart disease, and transportation services.15-17 In this study, nonfrailty was defined as a CFI < 0.15, prefrailty as a CFI of 0.15 to 0.24, and frailty as a CFI ≥ 0.25.
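The study's CFI cut points can be expressed as a minimal sketch (thresholds taken from the study as reported above; the classifier itself is illustrative only, not a clinical tool):

```python
def classify_frailty(cfi: float) -> str:
    """Map a claims-based frailty index (CFI) score to the study's
    categories: < 0.15 nonfrail, 0.15-0.24 prefrail, >= 0.25 frail."""
    if cfi < 0.15:
        return "nonfrail"
    elif cfi < 0.25:
        return "prefrail"
    return "frail"

# The cohorts' mean CFI of 0.19-0.20 falls in the prefrail range.
print(classify_frailty(0.20))
```

Note that the underlying CFI itself is estimated from 93 claims-based variables; only the final categorization step is shown here.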
The primary outcome—a composite endpoint of death, ischemic stroke, or major bleeding—was measured for each of the DOAC–warfarin cohorts in the overall population and stratified by frailty classification. Patients were followed until the occurrence of a study outcome, Medicare disenrollment, the end of the study period, discontinuation of the index drug (defined as > 5 days), change to a different anticoagulant, admission to a nursing facility, enrollment in hospice, initiation of dialysis, or kidney transplant. The authors conducted a PS-matched analysis to reduce any imbalances in clinical characteristics between the DOAC- and warfarin-treated groups, as well as a sensitivity analysis to assess the strength of the data findings using different assumptions.
The authors created 3 DOAC–warfarin cohorts: dabigatran (n = 81,863) vs warfarin (n = 256,722), rivaroxaban (n = 185,011) vs warfarin (n = 228,028), and apixaban (n = 222,478) vs warfarin (n = 206,031). After PS matching, the mean age in all cohorts was 76 to 77 years, about 50% were female, and 91% were White. The mean HAS-BLED score was 2 and the mean CHA2DS2-VASc score was 4. The mean CFI was 0.19 to 0.20, defined as prefrail. Patients classified as frail were older, more likely to be female, and more likely to have greater comorbidities, higher scores on CHA2DS2-VASc and HAS-BLED, and higher health care utilization.

In the dabigatran–warfarin cohort (median follow-up, 72 days), the event rate of the composite endpoint per 1000 person-years (PY) was 63.5 for dabigatran and 65.6 for warfarin (hazard ratio [HR] = 0.98; 95% CI, 0.92 to 1.05; rate difference [RD] per 1000 PY = –2.2; 95% CI, –6.5 to 2.1). Dabigatran was associated with a lower rate of the composite endpoint than warfarin in the nonfrail subgroup but not in the prefrail or frail subgroups.
In the rivaroxaban–warfarin cohort (median follow-up, 82 days), the composite endpoint rate per 1000 PY was 77.8 for rivaroxaban and 83.7 for warfarin (HR = 0.98; 95% CI, 0.94 to 1.02; RD per 1000 PY = –5.9; 95% CI, –9.4 to –2.4). When stratifying by frailty category, both dabigatran and rivaroxaban were associated with a lower composite endpoint rate than warfarin for the nonfrail population only (HR = 0.81; 95% CI, 0.68 to 0.97, and HR = 0.88; 95% CI, 0.77 to 0.99, respectively).
In the apixaban–warfarin cohort (median follow-up, 84 days), the rate of the composite endpoint per 1000 PY was 60.1 for apixaban and 92.3 for warfarin (HR = 0.68; 95% CI, 0.65 to 0.72; RD per 1000 PY = –32.2; 95% CI, –36.1 to –28.3). The beneficial association for apixaban was present in all frailty categories, with an HR of 0.61 (95% CI, 0.52 to 0.71) for nonfrail patients, 0.66 (95% CI, 0.61 to 0.70) for prefrail patients, and 0.73 (95% CI, 0.67 to 0.80) for frail patients. Apixaban was the only DOAC with a relative reduction in the hazard of death, ischemic stroke, or major bleeding among all frailty groups.
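The reported rate differences can be sanity-checked by subtracting the published event rates directly (a rough check only: the published RDs come from the propensity-matched analysis, so crude differences only approximate them, e.g. –2.1 crude vs –2.2 reported for dabigatran):

```python
# Composite event rates per 1000 person-years, as reported in the study
rates = {
    "dabigatran":  (63.5, 65.6),   # (DOAC rate, warfarin rate)
    "rivaroxaban": (77.8, 83.7),
    "apixaban":    (60.1, 92.3),
}

for drug, (doac_rate, warfarin_rate) in rates.items():
    crude_rd = round(doac_rate - warfarin_rate, 1)
    print(f"{drug}: crude RD = {crude_rd} per 1000 PY")
```

The crude apixaban difference (–32.2 per 1000 PY) matches the reported RD exactly, and the rivaroxaban difference (–5.9) does as well, which is consistent with the matched cohorts being well balanced.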
WHAT’S NEW
Only apixaban had lower AE rates vs warfarin across frailty levels
Compared with warfarin in older adults with AF, dabigatran and rivaroxaban were associated with a lower risk of death, ischemic stroke, or major bleeding only among nonfrail patients, whereas apixaban was associated with a relative reduction in these adverse outcomes across all frailty classifications.
CAVEATS
Important data but RCTs are needed
This study's large sample size gives it considerable statistical power, but it remains a retrospective observational study. The authors attempted to account for these limitations and potential confounders by performing a PS-matched analysis and a sensitivity analysis; nevertheless, the findings should be confirmed in randomized controlled trials.
Additionally, median follow-up in each of the DOAC–warfarin cohorts was less than 90 days. Trials addressing long-term outcomes are warranted.
Finally, there was no untreated control group; it is possible that forgoing anticoagulation altogether is the best choice for some frail elderly patients.
CHALLENGES TO IMPLEMENTATION
Doctors need a practical frailty scale, patients need an affordable Rx
Frailty is often not treated as a measurable trait, and the approach used in this study to calculate the CFI is not a practical clinical tool. Studies evaluating frailty-calculation software or an easily administered survey could help put this clinically impactful information in the hands of primary care physicians. The Clinical Frailty Scale—a brief, 7-point scale based on the physician’s clinical impression of the patient—has been found to correlate with other established frailty measures18 and might be an option for busy clinicians. However, the current study did not use this measurement, and the validity of its use by primary care physicians in the outpatient setting requires further study.
In addition, cost may be a barrier for patients younger than 65 years or for those older than 65 years who do not qualify for Medicare or do not have Medicare Part D. The average monthly cost of the DOACs ranges from $560 for dabigatran19 to $600 for rivaroxaban20 and $623 for apixaban.21 As always, the choice of anticoagulant therapy is a clinical judgment and a joint decision of the patient and physician.
ILLUSTRATIVE CASE
A frail 76-year-old woman with a history of hypertension and hyperlipidemia presents for evaluation of palpitations. An in-office electrocardiogram reveals that the patient is in AF. Her CHA2DS2-VASc score is 4 and her HAS-BLED score is 2.2,3 Using shared decision making, you decide to start medications for her AF. You plan to initiate a beta-blocker for rate control and must now decide on anticoagulation. Which oral anticoagulant would you prescribe for this patient’s AF, given her frail status?
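A simplified tally reproduces the case patient's CHA2DS2-VASc score of 4 (age ≥ 75 years = 2 points, female sex = 1, hypertension = 1). This is an illustrative sketch of the standard scoring items, not clinical software:

```python
def chads_vasc(age, female, chf=False, htn=False, diabetes=False,
               stroke_tia=False, vascular=False):
    """Simplified CHA2DS2-VASc tally (illustrative; not clinical advice).
    Points: CHF 1, hypertension 1, age >= 75 -> 2 (65-74 -> 1),
    diabetes 1, prior stroke/TIA 2, vascular disease 1, female sex 1."""
    score = 0
    score += 1 if chf else 0
    score += 1 if htn else 0
    score += 2 if age >= 75 else (1 if 65 <= age < 75 else 0)
    score += 1 if diabetes else 0
    score += 2 if stroke_tia else 0
    score += 1 if vascular else 0
    score += 1 if female else 0
    return score

# The 76-year-old woman with hypertension from the case:
print(chads_vasc(76, female=True, htn=True))  # 4, matching the vignette
```

Hyperlipidemia contributes no points, which is why her score is 4 rather than 5.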
1. Kim DH, Pawar A, Gagne JJ, et al. Frailty and clinical outcomes of direct oral anticoagulants versus warfarin in older adults with atrial fibrillation: a cohort study. Ann Intern Med. 2021;174:1214-1223. doi: 10.7326/M20-7141
2. Zhu W, He W, Guo L, et al. The HAS-BLED score for predicting major bleeding risk in anticoagulated patients with atrial fibrillation: a systematic review and meta-analysis. Clin Cardiol. 2015;38:555-561. doi: 10.1002/clc.22435
3. Olesen JB, Lip GYH, Hansen ML, et al. Validation of risk stratification schemes for predicting stroke and thromboembolism in patients with atrial fibrillation: nationwide cohort study. BMJ. 2011;342:d124. doi: 10.1136/bmj.d124
4. Xue QL. The frailty syndrome: definition and natural history. Clin Geriatr Med. 2011;27:1-15. doi: 10.1016/j.cger.2010.08.009
5. O’Caoimh R, Sezgin D, O’Donovan MR, et al. Prevalence of frailty in 62 countries across the world: a systematic review and meta-analysis of population-level studies. Age Ageing. 2021;50:96-104. doi: 10.1093/ageing/afaa219
6. Campitelli MA, Bronskill SE, Hogan DB, et al. The prevalence and health consequences of frailty in a population-based older home care cohort: a comparison of different measures. BMC Geriatr. 2016;16:133. doi: 10.1186/s12877-016-0309-z
7. Kojima G. Frailty as a predictor of future falls among community-dwelling older people: a systematic review and meta-analysis. J Am Med Dir Assoc. 2015;16:1027-1033. doi: 10.1016/j.jamda. 2015.06.018
8. Kojima G. Frailty as a predictor of fractures among community-dwelling older people: a systematic review and meta-analysis. Bone. 2016;90:116-122. doi: 10.1016/j.bone.2016.06.009
9. Kojima G. Quick and simple FRAIL scale predicts incident activities of daily living (ADL) and instrumental ADL (IADL) disabilities: a systematic review and meta-analysis. J Am Med Dir Assoc. 2018;19:1063-1068. doi: 10.1016/j.jamda.2018.07.019
10. Kojima G, Liljas AEM, Iliffe S. Frailty syndrome: implications and challenges for health care policy. Risk Manag Healthc Policy. 2019;12:23-30. doi: 10.2147/RMHP.S168750
11. Roe L, Normand C, Wren MA, et al. The impact of frailty on healthcare utilisation in Ireland: evidence from The Irish Longitudinal Study on Ageing. BMC Geriatr. 2017;17:203. doi: 10.1186/s12877-017-0579-0
12. Hao Q, Zhou L, Dong B, et al. The role of frailty in predicting mortality and readmission in older adults in acute care wards: a prospective study. Sci Rep. 2019;9:1207. doi: 10.1038/s41598-018-38072-7
13. Fried LP, Tangen CM, Walston J, et al; Cardiovascular Health Study Collaborative Research Group. Frailty in older adults: evidence for a phenotype. J Gerontol A Biol Sci Med Sci. 2001;56:M146-M156. doi: 10.1093/gerona/56.3.m146
14. Ryan J, Espinoza S, Ernst ME, et al. Validation of a deficit-accumulation frailty Index in the ASPirin in Reducing Events in the Elderly study and its predictive capacity for disability-free survival. J Gerontol A Biol Sci Med Sci. 2022;77:19-26. doi: 10.1093/gerona/glab225
15. Kim DH, Glynn RJ, Avorn J, et al. Validation of a claims-based frailty index against physical performance and adverse health outcomes in the Health and Retirement Study. J Gerontol A Biol Sci Med Sci. 2019;74:1271-1276. doi: 10.1093/gerona/gly197
16. Kim DH, Schneeweiss S, Glynn RJ, et al. Measuring frailty in Medicare data: development and validation of a claims-based frailty index. J Gerontol A Biol Sci Med Sci. 2018;73:980-987. doi: 10.1093/gerona/glx229
17. Claims-based frailty index. Harvard Dataverse website. 2022. Accessed April 5, 2022. https://dataverse.harvard.edu/dataverse/cfi
18. Rockwood K, Song X, MacKnight C, et al. A global clinical measure of fitness and frailty in elderly people. CMAJ. 2005;173:489-95. doi: 10.1503/cmaj.050051
19. Dabigatran. GoodRx. Accessed September 26, 2022. www.goodrx.com/dabigatran
20. Rivaroxaban. GoodRx. Accessed September 26, 2022. www.goodrx.com/rivaroxaban
21. Apixaban (Eliquis). GoodRx. Accessed September 26, 2022. www.goodrx.com/eliquis
PRACTICE CHANGER
Consider apixaban, which demonstrated a lower adverse event (AE) rate than warfarin regardless of frailty status, for anticoagulation treatment of older patients with nonvalvular atrial fibrillation (AF); by comparison, AE rates for dabigatran and rivaroxaban were lower vs warfarin only among nonfrail individuals.
STRENGTH OF RECOMMENDATION
C: Based on a retrospective observational cohort study.1
Kim DH, Pawar A, Gagne JJ, et al. Frailty and clinical outcomes of direct oral anticoagulants versus warfarin in older adults with atrial fibrillation: a cohort study. Ann Intern Med. 2021;174:1214-1223. doi: 10.7326/M20-7141
Prednisone, colchicine equivalent in efficacy for CPP crystal arthritis
PHILADELPHIA – Prednisone appears to have the edge over colchicine for control of pain in patients with acute calcium pyrophosphate (CPP) crystal arthritis, an intensely painful rheumatic disease primarily affecting older patients.
Among 111 patients with acute CPP crystal arthritis randomized to receive either prednisone or colchicine for control of acute pain in a multicenter study, 2 days of therapy with the oral agents provided equivalent pain relief on the second day, and patients generally tolerated each agent well, reported Tristan Pascart, MD, from the Groupement Hospitalier de l’Institut Catholique de Lille (France).
“Almost three-fourths of patients are considered to be good responders to both drugs on day 3, and, maybe, safety is the key issue distinguishing the two treatments: Colchicine was generally well tolerated, but even with this very short time frame of treatment, one patient out of five had diarrhea, which is more of a concern in this elderly population at risk of dehydration,” he said in an oral abstract session at the annual meeting of the American College of Rheumatology.
In contrast, only about 6% of patients assigned to prednisone had diarrhea, and the adverse events that occurred more frequently with the corticosteroid, including hypertension, hyperglycemia, and insomnia, all resolved after the therapy was stopped.
Common and acutely painful
Acute CPP crystal arthritis is a common complication that often occurs during hospitalization for primarily nonrheumatologic causes, Dr. Pascart said, and “in the absence of clinical trials, the management relies on expert opinion, which stems from extrapolated data from gap studies” primarily with prednisone or colchicine.
To fill in the knowledge gap, Dr. Pascart and colleagues conducted the COLCHICORT study to evaluate whether the two drugs were comparable in efficacy and safety for control of acute pain in a vulnerable population.
The multicenter, open-label trial included patients older than age 65 years with an estimated glomerular filtration rate above 30 mL/min per 1.73 m2 who presented with acute CPP deposition arthritis with symptoms occurring within the previous 36 hours. CPP arthritis was defined by the identification of CPP crystals on synovial fluid analysis or typical clinical presentation with evidence of chondrocalcinosis on x-rays or ultrasound.
Patients with a history of gout, cognitive decline that could impair pain assessment, or contraindications to either of the study drugs were excluded.
The participants were randomized to receive either colchicine 1.5 mg (1 mg to start, then 0.5 mg one hour later) at baseline and then 1 mg on day 1, or oral prednisone 30 mg at baseline and on day 1. The patients also received 1 g of systemic acetaminophen, and three 50-mg doses of tramadol during the first 24 hours.
Of the 111 patients randomized, 54 were assigned to receive prednisone, and 57 were assigned to receive colchicine. Baseline characteristics were similar between the groups, with a mean age of about 86 years, body mass index of around 25 kg/m2, and blood pressure in the range of 130/69 mm Hg.
For nearly half of the patients in each study arm, the most painful joint was the knee, followed by the wrists and ankles.
There was no difference between the groups in the primary efficacy outcome, the change from baseline at 24 hours in visual analog scale (VAS) pain score (0-100 mm), in either a per-protocol or a modified intention-to-treat analysis. The mean change in VAS at 24 hours was –36.6 mm in the colchicine group, compared with –37.7 mm in the prednisone group. The investigators had prespecified that a between-drug difference of less than 13 mm on the pain VAS at 24 hours would meet the definition of equivalent efficacy.
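The equivalence claim can be illustrated numerically: the between-group difference in mean VAS change was about 1.1 mm, well inside the prespecified 13-mm margin. (A formal equivalence test compares the confidence interval of the difference, not just the point estimate, against the margin; this sketch checks the point estimate only.)

```python
colchicine_change = -36.6   # mean VAS change at 24 h, mm
prednisone_change = -37.7
equivalence_margin = 13.0   # prespecified margin, mm

diff = abs(colchicine_change - prednisone_change)
print(round(diff, 1))              # 1.1 mm between-group difference
print(diff < equivalence_margin)   # True -> consistent with equivalence
```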
In both groups, a majority of patients had either an improvement greater than 50% in pain VAS scores and/or a pain VAS score less than 40 mm at both 24 and 48 hours.
At 7 days of follow-up, 21.8% of patients assigned to colchicine had diarrhea, compared with 5.6% of those assigned to prednisone. Adverse events occurring more frequently with prednisone included hyperglycemia, hypertension, and insomnia.
Patients who received colchicine and were also on statins had a trend toward a higher risk for diarrhea, but the study was not adequately powered to detect an association, and the trend was not statistically significant, Dr. Pascart said.
“Taken together, safety issues suggest that prednisone should be considered as the first-line therapy in acute CPP crystal arthritis. Future research is warranted to determine factors increasing the risk of colchicine-induced diarrhea,” he concluded.
Both drugs are used
Sara K. Tedeschi, MD, from Brigham & Women’s Hospital in Boston, who attended the session where the data were presented, has a special clinical interest in CPP deposition disease. She applauded Dr. Pascart and colleagues for conducting a rare clinical trial in CPP crystal arthritis.
In an interview, she said that the study suggests “we can keep in mind shorter courses of treatment for acute CPP crystal arthritis; I think that’s one big takeaway from this study.”
Asked whether she would change her practice based on the findings, Dr. Tedeschi replied: “I personally am not sure that I would be moved to use prednisone more than colchicine; I actually take away from this that colchicine is equivalent to prednisone for short-term use for CPP arthritis, but I think it’s also really important to note that this is in the context of quite a lot of acetaminophen and quite a lot of tramadol, and frankly I don’t usually use tramadol with my patients, but I might consider doing that, especially as there were no delirium events in this population.”
Dr. Tedeschi was not involved in the study.
Asked the same question, Michael Toprover, MD, from New York University Langone Medical Center, a moderator of the session who was not involved in the study, said: “I usually use a combination of medications. I generally, in someone who is hospitalized in particular and is in such severe pain, use a combination of colchicine and prednisone, unless I’m worried about infection, in which case I’ll start colchicine until we’ve proven that it’s CPPD, and then I’ll add prednisone.”
The study was funded by PHRC-1 GIRCI Nord Ouest, a clinical research program funded by the Ministry of Health in France. Dr. Pascart, Dr. Tedeschi, and Dr. Toprover all reported having no relevant conflicts of interest.
Both drugs are used
Sara K. Tedeschi, MD, from Brigham & Women’s Hospital in Boston, who attended the session where the data were presented, has a special clinical interest in CPP deposition disease. She applauded Dr. Pascart and colleagues for conducting a rare clinical trial in CPP crystal arthritis.
In an interview, she said that the study suggests “we can keep in mind shorter courses of treatment for acute CPP crystal arthritis; I think that’s one big takeaway from this study.”
Asked whether she would change her practice based on the findings, Dr. Tedeschi replied: “I personally am not sure that I would be moved to use prednisone more than colchicine; I actually take away from this that colchicine is equivalent to prednisone for short-term use for CPP arthritis, but I think it’s also really important to note that this is in the context of quite a lot of acetaminophen and quite a lot of tramadol, and frankly I don’t usually use tramadol with my patients, but I might consider doing that, especially as there were no delirium events in this population.”
Dr. Tedeschi was not involved in the study.
Asked the same question, Michael Toprover, MD, from New York University Langone Medical Center, a moderator of the session who was not involved in the study, said: “I usually use a combination of medications. I generally, in someone who is hospitalized in particular and is in such severe pain, use a combination of colchicine and prednisone, unless I’m worried about infection, in which case I’ll start colchicine until we’ve proven that it’s CPPD, and then I’ll add prednisone.”
The study was funded by PHRC-1 GIRCI Nord Ouest, a clinical research program funded by the Ministry of Health in France. Dr. Pascart, Dr. Tedeschi, and Dr. Toprover all reported having no relevant conflicts of interest.
PHILADELPHIA – Prednisone appears to have the edge over colchicine for control of pain in patients with acute calcium pyrophosphate (CPP) crystal arthritis, an intensely painful rheumatic disease primarily affecting older patients.
Among 111 patients with acute CPP crystal arthritis randomized to receive either prednisone or colchicine for control of acute pain in a multicenter study, 2 days of therapy with the oral agents provided equivalent pain relief on the second day, and patients generally tolerated each agent well, reported Tristan Pascart, MD, from the Groupement Hospitalier de l’Institut Catholique de Lille (France).
“Almost three-fourths of patients are considered to be good responders to both drugs on day 3, and, maybe, safety is the key issue distinguishing the two treatments: Colchicine was generally well tolerated, but even with this very short time frame of treatment, one patient out of five had diarrhea, which is more of a concern in this elderly population at risk of dehydration,” he said in an oral abstract session at the annual meeting of the American College of Rheumatology.
In contrast, only about 6% of patients assigned to prednisone had diarrhea, and other adverse events that occurred more frequently with the corticosteroid, including hypertension, hyperglycemia, and insomnia, all resolved after the therapy was stopped.
Common and acutely painful
Acute CPP crystal arthritis is a common complication that often occurs during hospitalization for primarily nonrheumatologic causes, Dr. Pascart said, and “in the absence of clinical trials, the management relies on expert opinion, which stems from extrapolated data from gout studies” primarily with prednisone or colchicine.
To fill in the knowledge gap, Dr. Pascart and colleagues conducted the COLCHICORT study to evaluate whether the two drugs were comparable in efficacy and safety for control of acute pain in a vulnerable population.
The multicenter, open-label trial included patients older than age 65 years with an estimated glomerular filtration rate above 30 mL/min per 1.73 m2 who presented with acute CPP deposition arthritis with symptoms occurring within the previous 36 hours. CPP arthritis was defined by the identification of CPP crystals on synovial fluid analysis or typical clinical presentation with evidence of chondrocalcinosis on x-rays or ultrasound.
Patients with a history of gout, cognitive decline that could impair pain assessment, or contraindications to either of the study drugs were excluded.
The participants were randomized to receive either colchicine 1.5 mg (1 mg to start, then 0.5 mg one hour later) at baseline and then 1 mg on day 1, or oral prednisone 30 mg at baseline and on day 1. The patients also received 1 g of systemic acetaminophen and three 50-mg doses of tramadol during the first 24 hours.
Of the 111 patients randomized, 54 were assigned to receive prednisone and 57 were assigned to receive colchicine. Baseline characteristics were similar between the groups, with a mean age of about 86 years, a body mass index of around 25 kg/m2, and a mean blood pressure of about 130/69 mm Hg.
For nearly half of all patients in each study arm, the most painful joint was the knee, followed by the wrists and ankles.
There was no difference between the groups in the primary efficacy outcome of a change at 24 hours over baseline in visual analog scale (VAS) (0-100 mm) scores, either in a per-protocol analysis or modified intention-to-treat analysis. The mean change in VAS at 24 hours in the colchicine group was –36.6 mm, compared with –37.7 mm in the prednisone group. The investigators had previously determined that any difference between the two drugs of less than 13 mm on pain VAS at 24 hours would meet the definition for equivalent efficacy.
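As a back-of-the-envelope illustration of that equivalence margin, the point estimates reported above can be compared directly (the VAS values come from the article; a formal equivalence analysis would test the confidence-interval bounds, not just the point estimates, against the 13-mm margin):

```python
# Point estimates reported in the COLCHICORT trial: mean change in pain VAS
# at 24 hours, in mm (negative values = pain relief).
colchicine_change = -36.6
prednisone_change = -37.7

# Prespecified equivalence margin: a between-group difference of less than
# 13 mm on the pain VAS at 24 hours.
margin_mm = 13.0

observed_difference = abs(colchicine_change - prednisone_change)
print(f"Observed difference: {observed_difference:.1f} mm")            # 1.1 mm
print("Within equivalence margin:", observed_difference < margin_mm)   # True
```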
In both groups, a majority of patients had either an improvement greater than 50% in pain VAS scores and/or a pain VAS score less than 40 mm at both 24 and 48 hours.
At 7 days of follow-up, 21.8% of patients assigned to colchicine had diarrhea, compared with 5.6% of those assigned to prednisone. Adverse events occurring more frequently with prednisone included hyperglycemia, hypertension, and insomnia.
Patients who received colchicine and were also on statins had a trend toward a higher risk for diarrhea, but the study was not adequately powered to detect an association, and the trend was not statistically significant, Dr. Pascart said.
“Taken together, safety issues suggest that prednisone should be considered as the first-line therapy in acute CPP crystal arthritis. Future research is warranted to determine factors increasing the risk of colchicine-induced diarrhea,” he concluded.
Both drugs are used
Sara K. Tedeschi, MD, from Brigham & Women’s Hospital in Boston, who attended the session where the data were presented, has a special clinical interest in CPP deposition disease. She applauded Dr. Pascart and colleagues for conducting a rare clinical trial in CPP crystal arthritis.
In an interview, she said that the study suggests “we can keep in mind shorter courses of treatment for acute CPP crystal arthritis; I think that’s one big takeaway from this study.”
Asked whether she would change her practice based on the findings, Dr. Tedeschi replied: “I personally am not sure that I would be moved to use prednisone more than colchicine; I actually take away from this that colchicine is equivalent to prednisone for short-term use for CPP arthritis, but I think it’s also really important to note that this is in the context of quite a lot of acetaminophen and quite a lot of tramadol, and frankly I don’t usually use tramadol with my patients, but I might consider doing that, especially as there were no delirium events in this population.”
Dr. Tedeschi was not involved in the study.
Asked the same question, Michael Toprover, MD, from New York University Langone Medical Center, a moderator of the session who was not involved in the study, said: “I usually use a combination of medications. I generally, in someone who is hospitalized in particular and is in such severe pain, use a combination of colchicine and prednisone, unless I’m worried about infection, in which case I’ll start colchicine until we’ve proven that it’s CPPD, and then I’ll add prednisone.”
The study was funded by PHRC-1 GIRCI Nord Ouest, a clinical research program funded by the Ministry of Health in France. Dr. Pascart, Dr. Tedeschi, and Dr. Toprover all reported having no relevant conflicts of interest.
AT ACR 2022
Total replacement and fusion yield similar outcomes for ankle osteoarthritis
Ankle osteoarthritis remains a cause of severe pain and disability. Patients are treated nonoperatively if possible, but surgery is often needed for individuals with end-stage disease, wrote Andrew Goldberg, MBBS, of University College London and colleagues in the Annals of Internal Medicine.
“Most patients with ankle arthritis respond to nonoperative treatments, such as weight loss, activity modification, support braces, and analgesia, [but] once the disease has progressed to end-stage osteoarthritis, the main surgical treatments are total ankle replacement or ankle arthrodesis,” Dr. Goldberg said in an interview.
In the new study, patients were randomized to receive either a total ankle replacement (TAR) or ankle fusion (AF).
“We showed that, in both treatment groups the clinical scores improved hugely, by more than three times the minimal clinically important difference,” Dr. Goldberg said in an interview.
“Although the ankle replacement arm improved, on average, by more than an extra 4 points over ankle fusion, this was not considered clinically or statistically significant,” he said.
The study is the first randomized trial in this area to provide high-quality, robust evidence, he noted, and its findings support data from previous studies.
“Although both TAR and ankle fusion have been shown to be effective, they are very different treatments, with one fusing the bones so that there is no ankle joint movement, and the other replacing the joint with the aim of retaining ankle joint movement. It is difficult for a patient to know which treatment is more suitable for them, with most seeking guidance from their surgeon,” he said.
Generating high-quality evidence
The study, a randomized, multicenter, open-label trial known as TARVA (Total Ankle Replacement Versus Ankle Arthrodesis), aimed to compare the clinical effectiveness of the two existing publicly funded U.K. treatment options, the authors wrote.
Patients were recruited at 17 U.K. centers between March 6, 2015, and Jan. 10, 2019. The study enrolled 303 adults aged 50-85 years with end-stage ankle osteoarthritis. The mean age of the participants was 68 years; 71% were men. A total of 137 TAR patients and 144 ankle fusion patients completed their surgeries with clinical scores available for analysis. Baseline characteristics were mainly similar between the groups.
Blinding was not possible because of the nature of the procedures, but the surgeons who screened the patients were not aware of the randomization allocations, the researchers noted. A total of 33 surgeons participated in the trial, with a median of seven patients per surgeon during the study period.
For TAR, U.K. surgeons use both two-component, fixed-bearing and three-component, mobile-bearing implants, the authors wrote. Ankle fusion was done using the surgeon’s usual technique of either arthroscopic-assisted or open ankle fusion.
The primary outcome was the change in the Manchester–Oxford Foot Questionnaire walking/standing (MOXFQ-W/S) domain scores from baseline to 52 weeks after surgery. The MOXFQ-W/S uses a scale of 0-100, with lower scores representing better outcomes. Secondary outcomes included change in the MOXFQ-W/S scores at 26 weeks after surgery, as well as measures of patient quality of life.
No statistically significant difference
Overall, the mean MOXFQ-W/S scores improved significantly from baseline to 52 weeks for both groups, with average improvements of 49.9 points in the TAR group and 44.4 points in the AF group. The average scores at 52 weeks were 31.4 in the TAR group and 36.8 in the AF group.
The adjusted difference in score change from baseline was –5.56, showing a slightly greater degree of improvement with TAR, but this difference was not clinically or statistically significant, the researchers noted.
Adverse event numbers were similar for both procedures, with 54% of TAR patients and 53% of AF patients experiencing at least one adverse event during the study period. Of those, 18% of TAR patients and 24% of AF patients experienced at least one serious adverse event.
However, the TAR patients experienced a higher rate of wound healing complications and nerve injuries, while thromboembolism was higher in the AF patients, the researchers noted.
A prespecified subgroup analysis of patients with osteoarthritis in adjacent joints suggested a greater improvement in TAR, compared with AF, a difference that increased when fixed-bearing TAR was compared with AF, the authors wrote.
“This reinforces previous reports that suggest that the presence of adjacent joint arthritis may be an indication for ankle replacement over AF,” the authors wrote in their discussion.
“Many of these patients did not have any symptoms in the adjacent joints,” they noted.
“The presence of adjacent joint arthritis, meaning the wear and tear of the joints around the ankle joint, seemed to favor ankle replacement,” Dr. Goldberg said. Approximately 30 joints in the foot continue to move after the ankle is fused, and if these adjacent joints are not healthy before surgery [as was the case in 42% of the study patients], the results of fusion were less successful, he explained.
A post hoc analysis between TAR subtypes showed that patients who had fixed-bearing TAR had significantly greater improvements, compared with AF patients, but this difference was not observed in patients who had mobile-bearing TAR, the researchers noted.
Dr. Goldberg said it was surprising “that, in a separate analysis, we found that the fixed-bearing ankle replacement patients [who accounted for half of the implants used] improved by a much greater difference when compared to ankle fusion.”
The study findings were limited by several factors including the short follow-up and study design that allowed surgeons to choose any implant and technique, the researchers noted.
Other limitations include a lack of data on cost-effectiveness and the impact of comorbidities on outcomes, they wrote. However, the study is the first completed multicenter randomized controlled trial to compare TAR and AF procedures for end-stage ankle osteoarthritis and shows that both yield similar clinical improvements, they concluded.
Data can inform treatment discussion
The take-home messages for clinicians are that both ankle replacement and ankle fusion are effective treatments that improve patients’ quality of life, and it is important to establish the health of adjacent joints before making treatment recommendations, Dr. Goldberg said.
“Careful counseling on the relative risks of each procedure should be part of the informed consent process,” he added. Ideally, all patients seeking surgical care for ankle arthritis should have a choice between ankle replacement and ankle fusion, but sometimes there is inequity of provision of the two treatments, he noted.
“We now encourage all surgeons to work in ankle arthritis networks so that every patient, no matter where they live, can have choice about the best treatment for them,” he said.
Researchers met the challenge of surgical RCT
Randomized trials of surgical interventions are challenging to conduct and are therefore limited in number, wrote Bruce Sangeorzan, MD, of the University of Washington, Seattle, and colleagues in an accompanying editorial. However, the new study was strengthened by the inclusion of 17 centers, which provided heterogeneity of implant type and surgeon experience level, the editorialists said in the Annals of Internal Medicine.
The study is especially important, because ankle arthritis treatment is very understudied, compared with hip and knee arthritis, but it has a similar impact on activity, editorial coauthor Dr. Sangeorzan said in an interview.
“Randomized controlled trials are the gold standard for comparing medical therapies,” he said, “but they are very difficult to do in surgical treatments, particularly when the two treatments can be differentiated, in this case by movement of the ankle.”
In addition, there is a strong placebo effect attached to interventions, Dr. Sangeorzan noted. “Determining best-case treatment relies on prospective research, preferably randomized. Since both ankle fusion and ankle replacement are effective therapies, a prospective randomized trial is the best way to help make treatment decisions,” he said.
The current study findings are not surprising, but they are preliminary, and 1 year of follow-up is not enough to determine effectiveness, Dr. Sangeorzan emphasized. However, “the authors have done the hard work of randomizing the patients and collecting the data, and the patients can now be followed for a longer time,” he said.
“In addition, the trial was designed with multiple secondary outcome measures, so the data can be matched up with larger trials that were not randomized to identify key elements of success for each procedure,” he noted.
The key message for clinicians is that ankle arthritis has a significant impact on patients’ lives, but there are two effective treatments that can reduce the impact of the disease, said Dr. Sangeorzan. “The data suggest that there are differences in implant design and differences in comorbidities that should influence decision-making,” he added.
Additional research is needed in the form of a longer study duration with larger cohorts, said Dr. Sangeorzan. In particular, researchers need to determine what comorbidities might drive patients to one type of care vs. another, he said. “The suggestion that [patients receiving implants with two motion segments have better outcomes than those receiving implants with a one-motion segment] also deserves further study,” he added.
The research was supported by the UK National Institute for Health and Care Research Health Technology Assessment Programme. The trial was sponsored by University College London. Dr. Goldberg disclosed grant support from NIHR HTA, as well as financial relationships with companies including Stryker, Paragon 28, and stock options with Standing CT Company, Elstree Waterfront Outpatients, and X Bolt Orthopedics.
The editorialists had no financial conflicts to disclose.
Ankle osteoarthritis remains a cause of severe pain and disability. Patients are treated nonoperatively if possible, but surgery is often needed for individuals with end-stage disease, wrote Andrew Goldberg, MBBS, of University College London and colleagues in the Annals of Internal Medicine.
“Most patients with ankle arthritis respond to nonoperative treatments, such as weight loss, activity modification, support braces, and analgesia, [but] once the disease has progressed to end-stage osteoarthritis, the main surgical treatments are total ankle re-placement or ankle arthrodesis,” Dr. Goldberg said, in an interview.
In the new study, patients were randomized to receive either a total ankle replacement (TAR) or ankle fusion (AF).
“We showed that, in both treatment groups the clinical scores improved hugely, by more than three times the minimal clinically important difference,” Dr. Goldberg said in an interview.
“Although the ankle replacement arm improved, on average, by more than an extra 4 points over ankle fusion, this was not considered clinically or statistically significant,” he said.
The study is the first randomized trial to show high-quality and robust results, he noted, and findings support data from previous studies.
“Although both TAR and ankle fusion have been shown to be effective, they are very different treatments, with one fusing the bones so that there is no ankle joint movement, and the other replacing the joint with the aim of retaining ankle joint movement. It is difficult for a patient to know which treatment is more suitable for them, with most seeking guidance from their surgeon,” he said.
Generating high-quality evidence
The study, a randomized, multicenter, open-label trial known as TARVA (Total Ankle Replacement Versus Ankle Arthrodesis), aimed to compare the clinical effectiveness of the two existing publicly funded U.K. treatment options, the authors wrote.
Patients were recruited at 17 U.K. centers between March 6, 2015, and Jan. 10, 2019. The study enrolled 303 adults aged 50-85 years with end-stage ankle osteoarthritis. The mean age of the participants was 68 years; 71% were men. A total of 137 TAR patients and 144 ankle fusion patients completed their surgeries with clinical scores available for analysis. Baseline characteristics were mainly similar between the groups.
Blinding was not possible because of the nature of the procedures, but the surgeons who screened the patients were not aware of the randomization allocations, the researchers noted. A total of 33 surgeons participated in the trial, treating a median of seven patients each during the study period.
For TAR, U.K. surgeons use both two-component, fixed-bearing and three-component, mobile-bearing implants, the authors wrote. Ankle fusion was done using the surgeon’s usual technique of either arthroscopic-assisted or open ankle fusion.
The primary outcome was the change in the Manchester–Oxford Foot Questionnaire walking/standing (MOXFQ-W/S) domain scores from baseline to 52 weeks after surgery. The MOXFQ-W/S uses a scale of 0-100, with lower scores representing better outcomes. Secondary outcomes included change in the MOXFQ-W/S scores at 26 weeks after surgery, as well as measures of patient quality of life.
No statistically significant difference
Overall, the mean MOXFQ-W/S scores improved significantly from baseline to 52 weeks in both groups, with average improvements of 49.9 points in the TAR group and 44.4 points in the AF group. The average scores at 52 weeks were 31.4 in the TAR group and 36.8 in the AF group.
The adjusted difference in score change from baseline was –5.56, showing a slightly greater degree of improvement with TAR, but this difference was not clinically or statistically significant, the researchers noted.
Adverse event numbers were similar for both procedures, with 54% of TAR patients and 53% of AF patients experiencing at least 1 adverse event during the study period. Of those, 18% of TAR patients and 24% of AF patients experienced at least 1 serious adverse event.
However, the TAR patients experienced a higher rate of wound healing complications and nerve injuries, while thromboembolism was higher in the AF patients, the researchers noted.
A prespecified subgroup analysis of patients with osteoarthritis in adjacent joints suggested greater improvement with TAR than with AF, a difference that increased when fixed-bearing TAR was compared with AF, the authors wrote.
“This reinforces previous reports that suggest that the presence of adjacent joint arthritis may be an indication for ankle replacement over AF,” the authors wrote in their discussion.
“Many of these patients did not have any symptoms in the adjacent joints,” they noted.
“The presence of adjacent joint arthritis, meaning the wear and tear of the joints around the ankle joint, seemed to favor ankle replacement,” Dr. Goldberg said. Approximately 30 joints in the foot continue to move after the ankle is fused, and if these adjacent joints are not healthy before surgery [as was the case in 42% of the study patients], the results of fusion were less successful, he explained.
A post hoc analysis between TAR subtypes showed that patients who had fixed-bearing TAR had significantly greater improvements, compared with AF patients, but this difference was not observed in patients who had mobile-bearing TAR, the researchers noted.
Dr. Goldberg said it was surprising “that, in a separate analysis, we found that the fixed-bearing ankle replacement patients [who accounted for half of the implants used] improved by a much greater difference when compared to ankle fusion.”
The study findings were limited by several factors, including the short follow-up and a study design that allowed surgeons to choose any implant and technique, the researchers noted.
Other limitations include a lack of data on cost-effectiveness and the impact of comorbidities on outcomes, they wrote. However, the study is the first completed multicenter randomized controlled trial to compare TAR and AF procedures for end-stage ankle osteoarthritis and shows that both yield similar clinical improvements, they concluded.
Data can inform treatment discussion
The take-home messages for clinicians are that both ankle replacement and ankle fusion are effective treatments that improve patients’ quality of life, and it is important to establish the health of adjacent joints before making treatment recommendations, Dr. Goldberg said.
“Careful counseling on the relative risks of each procedure should be part of the informed consent process,” he added. Ideally, all patients seeking surgical care for ankle arthritis should have a choice between ankle replacement and ankle fusion, but sometimes there is inequity of provision of the two treatments, he noted.
“We now encourage all surgeons to work in ankle arthritis networks so that every patient, no matter where they live, can have choice about the best treatment for them,” he said.
Researchers met the challenge of surgical RCT
Randomized trials of surgical interventions are challenging to conduct, and therefore limited, wrote Bruce Sangeorzan, MD, of the University of Washington, Seattle, and colleagues in an accompanying editorial. However, the new study was strengthened by the inclusion of 17 centers for heterogeneity of implant type and surgeon experience level, the editorialists said in the Annals of Internal Medicine.
The study is especially important because ankle arthritis treatment is very understudied compared with hip and knee arthritis, despite having a similar impact on activity, editorial coauthor Dr. Sangeorzan said in an interview.
“Randomized controlled trials are the gold standard for comparing medical therapies,” he said, “but they are very difficult to do in surgical treatments, particularly when the two treatments can be differentiated, in this case by movement of the ankle.”
In addition, there is a strong placebo effect attached to interventions, Dr. Sangeorzan noted. “Determining best-case treatment relies on prospective research, preferably randomized. Since both ankle fusion and ankle replacement are effective therapies, a prospective randomized trial is the best way to help make treatment decisions,” he said.
The current study findings are not surprising, but they are preliminary, and 1 year of follow-up is not enough to determine effectiveness, Dr. Sangeorzan emphasized. However, “the authors have done the hard work of randomizing the patients and collecting the data, and the patients can now be followed for a longer time,” he said.
“In addition, the trial was designed with multiple secondary outcome measures, so the data can be matched up with larger trials that were not randomized to identify key elements of success for each procedure,” he noted.
The key message for clinicians is that ankle arthritis has a significant impact on patients’ lives, but there are two effective treatments that can reduce the impact of the disease, said Dr. Sangeorzan. “The data suggest that there are differences in implant design and differences in comorbidities that should influence decision-making,” he added.
Additional research is needed in the form of a longer study duration with larger cohorts, said Dr. Sangeorzan. In particular, researchers need to determine what comorbidities might drive patients to one type of care vs. another, he said. “The suggestion that [patients receiving implants with two motion segments have better outcomes than those receiving implants with a one-motion segment] also deserves further study,” he added.
The research was supported by the UK National Institute for Health and Care Research Health Technology Assessment Programme. The trial was sponsored by University College London. Dr. Goldberg disclosed grant support from NIHR HTA, as well as financial relationships with companies including Stryker, Paragon 28, and stock options with Standing CT Company, Elstree Waterfront Outpatients, and X Bolt Orthopedics.
The editorialists had no financial conflicts to disclose.