High stress levels linked to cognitive decline

Older people with high levels of stress are nearly 40% more likely to have cognitive impairment than those with low stress, a new study shows.

Individuals with elevated stress levels also had higher rates of diabetes, hypertension, and other cardiovascular disease (CVD) risk factors. But even after controlling for those risk factors, stress remained an independent predictor of cognitive decline.

The national cohort study showed that the association between stress and cognition was similar between Black and White individuals and that those with controlled stress were less likely to have cognitive impairment than those with uncontrolled or new stress.

“We have known for a while that excess levels of stress can be harmful for the human body and the heart, but we are now adding more evidence that excess levels of stress can be harmful for cognition,” said lead investigator Ambar Kulshreshtha, MD, PhD, associate professor of family and preventive medicine and epidemiology at Emory University, Atlanta. “We were able to see that regardless of race or gender, stress is bad.”

The findings were published online in JAMA Network Open.

Independent risk factor

For the study, investigators analyzed data from the Reasons for Geographic and Racial Differences in Stroke (REGARDS) study, a national population-based cohort of Black and White participants aged 45 years or older, sampled from the U.S. population.

Participants completed a questionnaire designed to evaluate stress levels when they were enrolled in the study between 2003 and 2007 and again about 11 years after enrollment.

Of the 24,448 participants (41.6% Black) in the study, 22.9% reported elevated stress levels.

Those with high stress were more likely to be younger, female, Black, and smokers and to have a higher body mass index, and they were less likely to have a college degree or to be physically active. They also had lower family incomes and were more likely to have cardiovascular disease risk factors, such as hypertension, diabetes, and dyslipidemia.

Participants with elevated levels of perceived stress were 37% more likely to have poor cognition after adjustment for sociodemographic variables, cardiovascular risk factors, and depression (adjusted odds ratio, 1.37; 95% confidence interval, 1.22-1.53).

There was no significant difference between Black and White participants.
 

Damaging consequences

Researchers also found a dose-response relationship: the greatest cognitive decline occurred in people who reported high stress at both time points and in those who had new stress at follow-up (aOR, 1.16; 95% CI, 0.92-1.45), compared with those with resolved stress (aOR, 1.03; 95% CI, 0.81-1.32) or no stress (aOR, 0.81; 95% CI, 0.68-0.97).

A 1-unit increase in perceived stress was associated with a 4% increased risk of cognitive impairment after adjusting for sociodemographic variables, CVD risk factors, lifestyle factors, and depressive symptoms (aOR, 1.04; 95% CI, 1.02-1.06).

Although the study didn’t reveal the mechanisms that might link stress and cognition, it does point to stress as a potentially modifiable risk factor for cognitive decline, Dr. Kulshreshtha said.

“One in three of my patients have had to deal with extra levels of stress and anxiety over the past few years,” said Dr. Kulshreshtha. “We as clinicians are aware that stress can have damaging consequences to the heart and other organs, and when we see patients who have these complaints, especially elderly patients, we should spend some time asking people about their stress and how they are managing it.”
 

 

 

Additional screening

Gregory Day, MD, a neurologist at the Mayo Clinic, Jacksonville, Fla., said that the findings help fill a void in the research about stress and cognition.

“It’s a potentially important association that’s easy for us to miss in clinical practice,” said Dr. Day, who was not a part of the study. “It’s one of those things that we can all recognize impacts health, but we have very few large, well thought out studies that give us the data we need to inform its place in clinical decision-making.”

In addition to its large sample size, the overrepresentation of diverse populations is a strength of the study and a contribution to the field, Dr. Day said.

“One question they don’t directly ask is, is this association maybe due to some differences in stress? And the answer from the paper is probably not, because it looks like when we control for these things, we don’t see big differences in incident risk factors between races,” he added.

The findings also point to the need for clinicians, especially primary care physicians, to screen patients for stress during routine examinations.

“Every visit is an opportunity to screen for risk factors that are going to set people up for future brain health,” Dr. Day said. “In addition to screening for all of these other risk factors for brain health, maybe it’s worth including some more global assessment of stress in a standard screener.”

The study was funded by the National Institute of Neurological Disorders and Stroke, the National Institutes of Health, and the National Institute on Aging. Dr. Kulshreshtha and Dr. Day report no relevant financial relationships.

A version of this article first appeared on Medscape.com.

Antipsychotic cuts Alzheimer’s-related agitation

NEW ORLEANS – The antipsychotic brexpiprazole effectively improves agitation associated with Alzheimer’s disease (AD) with favorable tolerability, results of a phase 3 study suggest.

“In this phase 3 trial of patients with agitation in Alzheimer’s dementia, treatment with brexpiprazole 2 or 3 mg/day resulted in statistically significantly greater improvements in agitation versus placebo on the primary and key secondary endpoints,” said study investigator George Grossberg, MD, professor and director of the division of geriatric psychiatry, department of psychiatry & behavioral neuroscience, Saint Louis University.

Dr. Grossberg presented the findings as part of the annual meeting of the American Association for Geriatric Psychiatry.

Agitation common, distressing

With two previous studies also showing efficacy of brexpiprazole in AD-related agitation, Dr. Grossberg speculated that brexpiprazole will become the first drug to be approved for agitation in AD.

Agitation is one of the most common AD symptoms and is arguably the most distressing for patients and caregivers alike, Dr. Grossberg noted.

The drug was approved by the Food and Drug Administration in 2015 as an adjunctive therapy to antidepressants for adults with major depressive disorder and for adults with schizophrenia.

To investigate the drug at effective doses for AD-related agitation, the researchers conducted a phase 3 multicenter trial that included 345 patients with AD who met criteria for agitation and aggression.

Study participants had Mini-Mental State Examination (MMSE) scores between 5 and 22 at screening and baseline and a mean Cohen-Mansfield Agitation Inventory (CMAI) total score of about 79; a score above 45 is considered clinically significant agitation. Use of AD medications was permitted.

Patients had a mean age of 74 years and were randomly assigned in a 2:1 ratio to receive treatment with brexpiprazole 2 mg (n = 75) or 3 mg (n = 153) per day, or placebo (n = 117).

The study’s primary endpoint was improvement as assessed by the CMAI. Over 12 weeks, participants in the brexpiprazole group experienced greater improvement in agitation, with a mean change of –22.6 with brexpiprazole vs. –17.3 with placebo (P = .0026).

Brexpiprazole was also associated with significantly greater improvement in the secondary outcome of change from baseline to week 12 in agitation severity, as assessed using the Clinical Global Impression-Severity of Illness (CGI-S) score (mean change, –1.20 with brexpiprazole vs. –0.93 with placebo; P = .0078).

Specifically, treatment with the drug resulted in improvements across three key agitation subscales: aggressive behavior, such as physically striking out (P < .01 vs. placebo); physically nonaggressive behavior; and verbally agitated behavior, such as screaming or cursing (both P < .05).

Treatment-emergent adverse events (TEAEs) associated with brexpiprazole vs. placebo included somnolence (3.5% vs. 0.9%), nasopharyngitis (3.1% vs. 1.7%), dizziness (2.7% vs. 1.7%), diarrhea (2.2% vs. 0.9%), urinary tract infection (2.2% vs. 0.9%), and asthenia (2.2% vs. 0.0%).

“Aside from headache, no other TEAEs had an incidence of more than 5% in the brexpiprazole (2 or 3 mg) group, or in either dose group,” Dr. Grossberg said. “Cognition also remained stable,” he added.

 

 

Boxed warnings

Adverse events commonly associated with brexpiprazole include weight change, extrapyramidal events, falls, cardiovascular events, and sedation. In the study, all occurred at an incidence of less than 2% in both study groups, he noted.

Compared with the antipsychotic aripiprazole, brexpiprazole is associated with less weight gain and less akathisia, or motor restlessness.

One death occurred in the brexpiprazole 3 mg group, in a patient who had heart failure, pneumonia, and cachexia. Autopsy revealed cerebral and coronary atherosclerosis, and the death was considered unrelated to brexpiprazole, said Dr. Grossberg.

The death is notable because brexpiprazole, like aripiprazole and other typical and atypical antipsychotics, carries an FDA boxed warning related to an increased risk for death in older patients when used for dementia-related psychosis.

Noting that a black box warning about mortality risk is not a minor issue, Dr. Grossberg added that the risks are relatively low, whereas the risks associated with agitation in dementia can be high.

“If it’s an emergency situation, you have to treat the patient because otherwise they may harm someone else, or harm the staff, or harm their loved ones or themselves, and in those cases, we want to treat the patient first, get them under control, and then we worry about the black box,” he said.

In addition, “the No. 1 reason for getting kicked out of a nursing home is agitation or severe behaviors in the context of a dementia or a major neurocognitive disorder that the facility cannot control,” Dr. Grossberg added.

In such cases, patients may wind up in an emergency department and may not be welcome back at the nursing home.

“There’s always a risk/benefit ratio, and I have that discussion with patients and their families, but I can tell you that I’ve never had a family ask me not to use a medication because of the black box warning, because they see how miserable and how out of control their loved one is and they’re miserable because they see the suffering and will ask that we do anything that we can to get this behavior under control,” Dr. Grossberg said.

Caution still warranted

Commenting on the study, Rajesh R. Tampi, MD, professor and chairman of the department of psychiatry and the Bhatia Family Endowed Chair in Psychiatry at Creighton University, Omaha, Neb., underscored that, owing to the concerns behind the FDA warnings, “nonpharmacologic management is the cornerstone of treating agitation in Alzheimer’s dementia.”

He noted that the lack of an FDA-approved drug for agitation in AD is the result of “the overall benefits of any of the drug classes or drugs trialed to treat agitation in Alzheimer’s dementia vs. their adverse effect profile.”

Therefore, he continued, “any medication or medication class should be used with caution among these individuals who often have polymorbidity.”

Dr. Tampi agreed that “the use of each drug for agitation in AD should be on a case-by-case basis with a clear and documented risk/benefit discussion with the patient and their families.”

“These medications should only be used for refractory symptoms or emergency situations where the agitation is not managed adequately with nonpharmacologic techniques and with a clear and documented risk/benefit discussion with patients and their families,” Dr. Tampi said. 

The study was supported by Otsuka Pharmaceutical Development & Commercialization and H. Lundbeck. Dr. Grossberg has received consulting fees from Acadia, Avanir, Biogen, BioXcel, Genentech, Karuna, Lundbeck, Otsuka, Roche, and Takeda. Dr. Tampi had no disclosures to report.

A version of this article first appeared on Medscape.com.

This article was updated 3/14/23.

Cognitive remediation training reduces aggression in schizophrenia

Cognitive remediation training, with or without social cognitive training, was associated with reduced aggressive behavior in schizophrenia, based on data from 130 individuals.

Aggressive behavior, including verbal or physical threats or violent acts, is at least four times more likely among individuals with schizophrenia, compared with the general population, wrote Anzalee Khan, PhD, of the Nathan S. Kline Institute for Psychiatric Research, Orangeburg, N.Y., and colleagues. Recent studies suggest that psychosocial treatments such as cognitive remediation training (CRT) or social cognition training (SCT) may be helpful, but the potential benefit of combining these strategies has not been explored, they said.

In a study published in Schizophrenia Research, the authors randomized 62 adults with a diagnosis of schizophrenia or schizoaffective disorder to 36 sessions of combined cognitive remediation and social cognition training; 68 were randomized to cognitive remediation plus a computer-based control treatment. Participants also had at least one confirmed assault in the past year or a score of 5 or higher on the Life History of Aggression scale. Complete data were analyzed for 45 patients in the CRT plus SCT group and 34 in the CRT control group.

The primary outcome was aggression as measured with the Modified Overt Aggression Scale (OAS-M), on which higher scores indicate higher levels of aggression. Incidents of aggression were coded from hospital staff reports and summarized weekly. The mean age of the participants was 34.9 years (range, 18-60 years), 85% were male, and the mean duration of education was 11.5 years.

At the study’s end (14 weeks), participants in both groups showed significant reductions in measures of aggression from baseline, with the largest effect size for the total global OAS-M score (effect size 1.11 for CRT plus SCT and 0.73 for the CRT plus control group).

The results failed to confirm the hypothesis that combining CRT and SCT would yield significantly greater improvement in aggression than CRT alone, the researchers wrote in their discussion. Potential reasons include an underdosed SCT intervention (only 12 sessions) and the nature of the SCT used in the study, which included few models of aggressive social interactions and more models related to social engagement.

Although adding SCT did not have a significant impact on aggression, patients in the CRT plus SCT group showed greater improvement in cognitive function, emotion recognition, and mentalizing, compared with the controls without SCT, the researchers noted.

“While these findings are not surprising given that participants in the CRT plus SCT group received active social cognition training, they do support the idea that social cognition training may have contributed to further strengthen our effect on cognition,” they wrote.

The findings were limited by several factors, including the study population of individuals with chronic schizophrenia and low levels of function in long-term tertiary care, which may limit generalizability, and the inability to control for the effects of pharmacotherapy, the researchers said.

However, the results were strengthened by the multidimensional assessments at both time points and the use of two cognitive and social cognition interventions, and suggest that adding social cognitive training enhanced the effect of CRT on cognitive function, emotion regulation, and mentalizing capacity, they said.

“Future studies are needed to examine the antiaggressive effects of a more intensive and more targeted social cognition intervention combined with CRT,” they concluded.

The study was supported by the Brain and Behavior Research Foundation and the Weill Cornell Clinical and Translational Science Award Program, National Institutes of Health/National Center for Advancing Translational Sciences. The researchers had no financial conflicts to disclose.

Migraine after concussion linked to worse outcomes

Children who experience migraine headaches in the aftermath of a concussion are more likely to experience prolonged symptoms of the head injury than are those with other forms of headache or no headaches at all, researchers have found.

“Early assessment of headache – and whether it has migraine features – after concussion can be helpful in predicting which children are at risk for poor outcomes and identifying children who require targeted intervention,” said senior author Keith Owen Yeates, PhD, the Ronald and Irene Ward Chair in Pediatric Brain Injury Professor and head of the department of psychology at the University of Calgary (Alta.). “Posttraumatic headache, especially when it involves migraine features, is a strong predictor of persisting symptoms and poorer quality of life after childhood concussion.”

Approximately 840,000 children per year visit an emergency department in the United States after having a traumatic brain injury. As many as 90% of those visits are considered to involve a concussion, according to the investigators. Although most children recover quickly, approximately one-third continue to report symptoms a month after the event.

Posttraumatic headache occurs in up to 90% of children, most commonly with features of migraine.

The new study, published in JAMA Network Open, was a secondary analysis of the Advancing Concussion Assessment in Pediatrics (A-CAP) prospective cohort study. The study was conducted at five emergency departments in Canada from September 2016 to July 2019 and included children and adolescents aged 8-17 years who presented with acute concussion or an orthopedic injury.

Children were included in the concussion group if they had a history of blunt head trauma resulting in at least one of three criteria consistent with the World Health Organization definition of mild traumatic brain injury. The criteria include loss of consciousness for less than 30 minutes, a Glasgow Coma Scale score of 13 or 14, or at least one acute sign or symptom of concussion, as noted by emergency clinicians.

Patients were excluded from the concussion group if they had deteriorating neurologic status, underwent neurosurgical intervention, had posttraumatic amnesia that lasted more than 24 hours, or had a score higher than 4 on the Abbreviated Injury Scale (AIS). The orthopedic injury group included patients without symptoms of concussion and with blunt trauma associated with an AIS score of 4 or less. Patients were excluded from both groups if they had an overnight hospitalization for traumatic brain injury, a concussion within the past 3 months, or a neurodevelopmental disorder.

The researchers analyzed data from 928 children of 967 enrolled in the study. The median age was 12.2 years, and 41.3% were female. The final study cohort included 239 children with orthopedic injuries but no headache, 160 with a concussion and no headache, 134 with a concussion and nonmigraine headaches, and 254 with a concussion and migraine headaches.

Children with posttraumatic migraines 10 days after a concussion had the most severe symptoms and worst quality of life 3 months following their head trauma, the researchers found. Children without headaches within 10 days after concussion had the best 3-month outcomes, comparable to those with orthopedic injuries alone.

The researchers said the strengths of their study included its large population and its inclusion of various causes of head trauma, not just sports-related concussions. Limitations included self-reports of headaches instead of a physician diagnosis and lack of control for clinical interventions that might have affected the outcomes.

Charles Tator, MD, PhD, director of the Canadian Concussion Centre at Toronto Western Hospital, said the findings were unsurprising.

“Headaches are the most common symptom after concussion,” Dr. Tator, who was not involved in the latest research, told this news organization. “In my practice and research with concussed kids 11 and up and with adults, those with preconcussion history of migraine are the most difficult to treat because their headaches don’t improve unless specific measures are taken.”

Dr. Tator, who also is a professor of neurosurgery at the University of Toronto, said clinicians who treat concussions must determine which type of headaches children are experiencing – and refer as early as possible for migraine prevention or treatment and medication, as warranted.

“Early recognition after concussion that migraine headaches are occurring will save kids a lot of suffering,” he said.

The study was supported by a Canadian Institute of Health Research Foundation Grant and by funds from the Alberta Children’s Hospital Foundation and the Alberta Children’s Hospital Research Institute. Dr. Tator has disclosed no relevant financial relationships.
 

A version of this article first appeared on Medscape.com.

Black people are less likely to receive dementia meds

Black people with dementia are less likely than their White peers to receive cognitive enhancers and other medications for dementia in the outpatient setting, preliminary data from a retrospective study show.

“There have been disparities regarding the use of cognition-enhancing medications in the treatment of dementia described in the literature, and disparities in the use of adjunctive treatments for other neuropsychiatric symptoms of dementia described in hospital and nursing home settings,” said study investigator Alice Hawkins, MD, with the department of neurology, Icahn School of Medicine at Mount Sinai, New York. “However, less is known about use of dementia medications that people take at home. Our study found disparities in this area as well,” Dr. Hawkins said.

The findings were released ahead of the study’s scheduled presentation at the annual meeting of the American Academy of Neurology.
 

More research needed

The researchers analyzed data on 3,655 Black and 12,885 White patients with a diagnosis of dementia who were seen at Mount Sinai. They evaluated utilization of five medication classes:

  • cholinesterase inhibitors.
  • N-methyl D-aspartate (NMDA) receptor antagonists.
  • selective serotonin reuptake inhibitors (SSRIs).
  • antipsychotics.
  • benzodiazepines.

They found that Black patients with dementia received cognitive enhancers less often than White patients with dementia (20% vs. 30% for cholinesterase inhibitors; 10% vs. 17% for NMDA antagonists).

Black patients with dementia were also less likely to receive medications for behavioral and psychological symptom management, compared with White peers (24% vs. 40% for SSRIs; 18% vs. 22% for antipsychotics; and 18% vs. 37% for benzodiazepines).

These disparities remained even after controlling for factors such as demographics and insurance coverage.
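
To put the size of these gaps in perspective, the reported percentages alone imply sizable unadjusted differences in the odds of being prescribed each drug class. The sketch below is illustrative arithmetic only – it computes crude odds ratios from the rates quoted above and does not reproduce the study’s adjusted estimates:

```python
def crude_odds_ratio(rate_black: float, rate_white: float) -> float:
    """Unadjusted odds ratio: odds of prescription in Black vs. White patients."""
    return (rate_black / (1 - rate_black)) / (rate_white / (1 - rate_white))

# Prescription rates quoted above, expressed as proportions (Black, White)
rates = {
    "cholinesterase inhibitors": (0.20, 0.30),
    "NMDA antagonists": (0.10, 0.17),
    "SSRIs": (0.24, 0.40),
    "antipsychotics": (0.18, 0.22),
    "benzodiazepines": (0.18, 0.37),
}

for drug_class, (black, white) in rates.items():
    print(f"{drug_class}: crude OR ≈ {crude_odds_ratio(black, white):.2f}")
# e.g., cholinesterase inhibitors ≈ 0.58, benzodiazepines ≈ 0.37
```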

“Larger systemic forces such as systemic racism, quality of care, and provider bias are harder to pin down, particularly in the medical record, though they all may be playing a role in perpetuating these inequities. More research will be needed to pinpoint all the factors that are contributing to these disparities,” said Dr. Hawkins.

The researchers found that Black patients who were referred to a neurologist received cholinesterase inhibitors and NMDA antagonists at rates comparable to those of White patients. “Therefore, referrals to specialists such as neurologists may decrease the disparities for these prescriptions,” Dr. Hawkins said.
 

Crucial research

Commenting on the findings, Carl V. Hill, PhD, MPH, Alzheimer’s Association chief diversity, equity, and inclusion officer, said the study “adds to previous research that points to inequities in the administering of medications for dementia symptoms, and highlights the inequities we know exist in dementia care.”

“Cognitive enhancers and other behavioral/psychological management drugs, while they don’t stop, slow, or cure dementia, can offer relief for some of the challenging symptoms associated with diseases caused by dementia. If people aren’t being appropriately prescribed medications that may offer symptom relief from this challenging disease, it could lead to poorer health outcomes,” said Dr. Hill.

“These data underscore the importance of health disparities research that is crucial in uncovering inequities in dementia treatment, care, and research for Black individuals, as well as all underrepresented populations.

“We must create a society in which the underserved, disproportionately affected, and underrepresented are safe, cared for, and valued. This can be done through enhancing cultural competence in health care settings, improving representation within the health care system, and engaging and building trust with diverse communities,” Dr. Hill said.

The Alzheimer’s Association has partnered with more than 500 diverse community-based groups on disease education programs to ensure families have information and resources to navigate this devastating disease.

The study was supported by the American Academy of Neurology Resident Research Scholarship. Dr. Hawkins and Dr. Hill reported no relevant financial relationships.

A version of this article first appeared on Medscape.com.

Childhood nightmares a prelude to cognitive problems, Parkinson’s?

Children who suffer from persistent bad dreams may be at increased risk for cognitive impairment or Parkinson’s disease (PD) later in life, new research shows.

Compared with children who never had distressing dreams between ages 7 and 11 years, those who had persistent distressing dreams were 76% more likely to develop cognitive impairment and roughly seven times more likely to develop PD by age 50 years.

It’s been shown previously that sleep problems in adulthood, including distressing dreams, can precede the onset of neurodegenerative diseases such as Alzheimer’s disease (AD) or PD by several years, and in some cases decades, study investigator Abidemi Otaiku, BMBS, University of Birmingham (England), told this news organization.

However, no studies have investigated whether distressing dreams during childhood might also be associated with increased risk for cognitive decline or PD.

“As such, these findings provide evidence for the first time that certain sleep problems in childhood (having regular distressing dreams) could be an early indicator of increased dementia and PD risk,” Dr. Otaiku said.

He noted that the findings build on previous studies which showed that regular nightmares in childhood could be an early indicator for psychiatric problems in adolescence, such as borderline personality disorder, attention-deficit/hyperactivity disorder, and psychosis.

The study was published online February 26 in The Lancet journal eClinicalMedicine.

Statistically significant

The prospective, longitudinal analysis used data from the 1958 British Birth Cohort Study, which included all people born in Britain during a single week in 1958.

At ages 7 (in 1965) and 11 years (in 1969), mothers were asked to report whether their child had experienced “bad dreams or night terrors” in the previous 3 months; cognitive impairment and PD were ascertained at age 50 (in 2008).

Among a total of 6,991 children (51% girls), 78.2% never had distressing dreams, 17.9% had transient distressing dreams (at either age 7 or 11 years), and 3.8% had persistent distressing dreams (at both ages 7 and 11 years).

By age 50, 262 participants had developed cognitive impairment, and five had been diagnosed with PD.

After adjusting for all covariates, having more regular distressing dreams during childhood was “linearly and statistically significantly” associated with higher risk of developing cognitive impairment or PD by age 50 years (P = .037). This was the case in both boys and girls.

Compared with children who never had bad dreams, peers who had persistent distressing dreams (at ages 7 and 11 years) had an 85% increased risk for cognitive impairment or PD by age 50 (adjusted odds ratio, 1.85; 95% confidence interval, 1.10-3.11; P = .019).

The associations remained when incident cognitive impairment and incident PD were analyzed separately.

Compared with children who never had distressing dreams, children who had persistent distressing dreams were 76% more likely to develop cognitive impairment by age 50 years (aOR, 1.76; 95% CI, 1.03-2.99; P = .037), and were about seven times more likely to be diagnosed with PD by age 50 years (aOR, 7.35; 95% CI, 1.03-52.73; P = .047).

The linear association was statistically significant for PD (P = .050) and had a trend toward statistical significance for cognitive impairment (P = .074).
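
For readers unfamiliar with how such figures are produced, adjusted odds ratios and their confidence intervals typically come from exponentiating logistic regression coefficients. A minimal generic sketch follows; the coefficient and standard error are made-up values chosen only to roughly reproduce the reported aOR of 1.76, not numbers taken from the paper:

```python
import math

def odds_ratio_with_ci(beta: float, se: float, z: float = 1.96):
    """Convert a logistic regression coefficient and standard error
    into an odds ratio with a 95% confidence interval."""
    return math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se)

# Hypothetical coefficient for 'persistent distressing dreams'
beta, se = 0.565, 0.27
print(odds_ratio_with_ci(beta, se))  # ≈ (1.76, 1.04, 2.99)
```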

 

 

Mechanism unclear

“Early-life nightmares might be causally associated with cognitive impairment and PD, noncausally associated with cognitive impairment and PD, or both. At this stage it remains unclear which of the three options is correct. Therefore, further research on mechanisms is needed,” Dr. Otaiku told this news organization.

“One plausible noncausal explanation is that there are shared genetic factors which predispose individuals to having frequent nightmares in childhood, and to developing neurodegenerative diseases such as AD or PD in adulthood,” he added.

It’s also plausible that having regular nightmares throughout childhood could be a causal risk factor for cognitive impairment and PD by causing chronic sleep disruption, he noted.

“Chronic sleep disruption due to nightmares might lead to impaired glymphatic clearance during sleep – and thus greater accumulation of pathological proteins in the brain, such as amyloid-beta and alpha-synuclein,” Dr. Otaiku said.

Disrupted sleep throughout childhood might also impair normal brain development, which could make children’s brains less resilient to neuropathologic damage, he said.

Clinical implications?

There are established treatments for childhood nightmares, including nonpharmacologic approaches.

“For children who have regular nightmares that lead to impaired daytime functioning, it may well be a good idea for them to see a sleep physician to discuss whether treatment may be needed,” Dr. Otaiku said.

But should doctors treat children with persistent nightmares for the purpose of preventing neurodegenerative diseases in adulthood or psychiatric problems in adolescence?

“It’s an interesting possibility. However, more research is needed to confirm these epidemiological associations and to determine whether or not nightmares are a causal risk factor for these conditions,” Dr. Otaiku concluded.

The study received no external funding. Dr. Otaiku reports no relevant disclosures.

A version of this article first appeared on Medscape.com.

Even mild COVID is hard on the brain

Even mild cases of COVID-19 can affect the function and structure of the brain, early research suggests.

“Our results suggest a severe pattern of changes in how the brain communicates as well as its structure, mainly in people with anxiety and depression with long-COVID syndrome, which affects so many people,” study investigator Clarissa Yasuda, MD, PhD, from University of Campinas, São Paulo, said in a news release.

“The magnitude of these changes suggests that they could lead to problems with memory and thinking skills, so we need to be exploring holistic treatments even for people mildly affected by COVID-19,” Dr. Yasuda added.

The findings were released March 6 ahead of the study’s scheduled presentation at the annual meeting of the American Academy of Neurology.
 

Brain shrinkage

Some studies have shown a high prevalence of symptoms of anxiety and depression in COVID-19 survivors, but few have investigated the associated cerebral changes, Dr. Yasuda told this news organization.

The study included 254 adults (177 women, 77 men, median age 41 years) who had mild COVID-19 a median of 82 days earlier. A total of 102 had symptoms of both anxiety and depression, and 152 had no such symptoms.

On brain imaging, those with COVID-19 and anxiety and depression had atrophy in the limbic area of the brain, which plays a role in memory and emotional processing.

No shrinkage in this area was evident in people who had COVID-19 without anxiety and depression or in a healthy control group of individuals without COVID-19.

The researchers also observed a “severe” pattern of abnormal cerebral functional connectivity in those with COVID-19 and anxiety and depression. 

In this functional connectivity analysis, individuals with COVID-19 and anxiety and depression had widespread functional changes in each of the 12 networks assessed, while those with COVID-19 but without symptoms of anxiety and depression showed changes in only 5 networks.
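
Functional connectivity analyses of this kind generally quantify how strongly activity in different brain networks rises and falls together over a scan, most simply as pairwise correlations between regional time series. The snippet below is a generic illustration of that idea using simulated data; it is not the authors’ imaging pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated BOLD-like signals: 200 time points for 12 networks
n_timepoints, n_networks = 200, 12
signals = rng.standard_normal((n_timepoints, n_networks))

# Functional connectivity matrix: pairwise Pearson correlations
fc = np.corrcoef(signals, rowvar=False)   # shape (12, 12)

# A group analysis would then test where these correlations differ,
# e.g., between patients with and without anxiety/depression symptoms.
print(fc.shape, round(float(fc[0, 1]), 3))
```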
 

Mechanisms unclear

“Unfortunately, the underpinning mechanisms associated with brain changes and neuropsychiatric dysfunction after COVID-19 infection are unclear,” Dr. Yasuda told this news organization.

“Some studies have demonstrated an association between symptoms of anxiety and depression with inflammation. However, we hypothesize that these cerebral alterations may result from a more complex interaction of social, psychological, and systemic stressors, including inflammation. It is indeed intriguing that such alterations are present in individuals who presented mild acute infection,” Dr. Yasuda added.

“Symptoms of anxiety and depression are frequently observed after COVID-19 and are part of long-COVID syndrome for some individuals. These symptoms require adequate treatment to improve the quality of life, cognition, and work capacity,” she said.

Treating these symptoms may induce “brain plasticity, which may result in some degree of gray matter increase and eventually prevent further structural and functional damage,” Dr. Yasuda said. 

A limitation of the study was that symptoms of anxiety and depression were self-reported, meaning people may have misjudged or misreported symptoms.

Commenting on the findings for this news organization, Cyrus Raji, MD, PhD, with the Mallinckrodt Institute of Radiology, Washington University, St. Louis, said the idea that COVID-19 is bad for the brain isn’t new. Dr. Raji was not involved with the study.

Early in the pandemic, Dr. Raji and colleagues published a paper detailing COVID-19’s effects on the brain, and Dr. Raji followed it up with a TED talk on the subject.

“Within the growing framework of what we already know about COVID-19 infection and its adverse effects on the brain, this work incrementally adds to this knowledge by identifying functional and structural neuroimaging abnormalities related to anxiety and depression in persons suffering from COVID-19 infection,” Dr. Raji said.

The study was supported by the São Paulo Research Foundation. The authors have no relevant disclosures. Dr. Raji is a consultant for Brainreader, Apollo Health, Pacific Neuroscience Foundation, and Neurevolution LLC.

Any level of physical activity tied to better later-life memory

Any amount of exercise in middle age is associated with better cognition in later life, new research suggests.

A prospective study of 1,400 participants showed that those who exercised to any extent in adulthood had significantly better cognitive scores later in life, compared with their peers who were physically inactive.

Maintaining an exercise routine throughout adulthood showed the strongest link to subsequent mental acuity.

Although these associations lessened when investigators controlled for childhood cognitive ability, socioeconomic background, and education, they remained statistically significant.

“Our findings support recommendations for greater participation in physical activity across adulthood,” lead investigator Sarah-Naomi James, PhD, research fellow at the Medical Research Council Unit for Lifelong Health and Ageing at the University College London, told this news organization.

“We provide evidence to encourage inactive adults to be active even to a small extent … at any point during adulthood,” which can improve cognition and memory later in life, Dr. James said.

The findings were published online in the Journal of Neurology, Neurosurgery & Psychiatry.
 

Exercise timing

Previous studies have established a link between fitness training and cognitive benefit later in life, but the researchers wanted to explore whether the timing or type of exercise influenced cognitive outcomes in later life.

The investigators asked more than 1,400 participants in the 1946 British birth cohort how much they had exercised at ages 36, 43, 60, and 69 years.

The questions changed slightly for each assessment period, but in general, participants were asked whether in the past month they had exercised or participated in such activities as badminton, swimming, fitness exercises, yoga, dancing, football, mountain climbing, jogging, or brisk walks for 30 minutes or more; and if so, how many times they participated per month.

Prior research showed that when the participants were aged 60 years, the most commonly reported activities were walking (71%), swimming (33%), floor exercises (24%), and cycling (15%).

When they turned 69, researchers tested participants’ cognitive performance using the Addenbrooke’s Cognitive Examination–III, which measures attention and orientation, verbal fluency, memory, language, and visuospatial function. In this study sample, 53% were women, and all were White.

Physical activity levels were classified as inactive, moderately active (one to four times per month), and most active (five or more times per month). In addition, they were summed across all five assessments to create a total score ranging from 0 (inactive at all ages) to 5 (active at all ages).
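
In practice, each assessment is binned by reported monthly frequency, and the five “active to any extent” indicators are then summed into a lifetime score. A minimal sketch of that scoring scheme, using hypothetical responses rather than the cohort’s actual coding:

```python
def classify(times_per_month: int) -> str:
    """Bin a single assessment by reported monthly frequency."""
    if times_per_month == 0:
        return "inactive"
    if times_per_month <= 4:
        return "moderately active"
    return "most active"

def lifetime_score(frequencies: list[int]) -> int:
    """Count how many of the five assessments were active to any extent (0-5)."""
    return sum(1 for f in frequencies if f > 0)

# Hypothetical participant: active at three of five assessments
print(classify(2), lifetime_score([0, 2, 6, 0, 1]))  # moderately active 3
```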

Overall, 11% of participants were physically inactive at all five time points; 17% were active at one time point; 20% were active at two time points and 20% at three time points; 17% were active at four time points; and 15% were active at all five time points.
 

‘Cradle to grave’ study?

Results showed that being physically active at all study time points was significantly associated with higher cognitive performance, verbal memory, and processing speed when participants were aged 69 (P < .01).

Those who exercised to any extent in adulthood – even just once a month during one of the time periods – fared better cognitively in later life than physically inactive participants (P < .01).

Study limitations cited include a lack of diversity among participants and a disproportionately high attrition rate among those who were socially disadvantaged.

“Our findings show that being active during every decade from their 30s on was associated with better cognition at around 70. Indeed, those who were active for longer had the highest cognitive function,” Dr. James said.

“However, it is also never too late to start. People in our study who only started being active in their 50s or 60s still had higher cognitive scores at age 70, compared to people of the same age who had never been active,” she added.

Dr. James intends to continue following the study sample to determine whether physical activity is linked to preserved cognitive aging “and buffers the effects of cognitive deterioration in the presence of disease markers that cause dementia, ultimately delaying dementia onset.

“We hope the cohort we study will be the first ‘cradle to grave’ study in the world, where we have followed people for their entire lives,” she said.

Encouraging finding

In a comment, Joel Hughes, PhD, professor of psychology and director of clinical training at Kent (Ohio) State University, said the study contributes to the idea that “accumulation of physical activity over one’s lifetime fits the data better than a ‘sensitive period’ – which suggests that it’s never too late to start exercising.”

Dr. Hughes, who was not involved in the research, noted that “exercise can improve cerebral blood flow and hemodynamic function, as well as greater activation of relevant brain regions such as the frontal lobes.”

While observing that the effects of exercise on cognition are likely complex from a mechanistic point of view, the finding that “exercise preserves or improves cognition later in life is encouraging,” he said.

The study received funding from the UK Medical Research Council and Alzheimer’s Research UK. The investigators and Dr. Hughes report no relevant financial relationships.

A version of this article first appeared on Medscape.com.

Issue
Neurology Reviews - 31(4)
Article Source

FROM THE JOURNAL OF NEUROLOGY, NEUROSURGERY & PSYCHIATRY


Alzheimer’s disease: What is ‘clinically meaningful’?

Article Type
Changed
Mon, 02/27/2023 - 16:44

A recent report in Alzheimer’s & Dementia: The Journal of the Alzheimer’s Association suggested that, at least for now, we need to lower the bar in Alzheimer’s disease drug trials.

The authors’ point is that there’s no consensus on “clinically meaningful benefit.” Does it mean a complete cure for Alzheimer’s disease, with reversal of deficits? Or stopping disease progression where it is? Or just slowing things down enough that it means something to patients, family members, and caregivers?

The last one is, realistically, where we are now.

Dr. Allan M. Block, a neurologist in Scottsdale, Arizona.

The problem with this is that many nonmedical people equate “treatment” with “cure,” which isn’t close to the truth for many diseases. In Alzheimer’s disease, it’s even trickier to figure out. There’s a disparity between the imaging results (which suggest the drugs should be quite effective) and the clinical results (which aren’t nearly as impressive as the PET scans).

So when I prescribe any of the Alzheimer’s medications, I make it pretty clear to patients, and more importantly the patient’s family, what they can and can’t expect. This isn’t easy, because most will come back a month later, tell me their loved one is no better, and want to try something else. So I have to explain it again. These people aren’t stupid. They’re hopeful, and also facing an impossible question. “Better” is a lot easier to judge than “slowed progression.”

“Better” is a great word for migraines. Or seizures. Or Parkinson’s disease. These are conditions in which patients and families can tell us whether they’ve seen an improvement.

But with the current treatments for Alzheimer’s disease, we’re asking patients and families, “do you think you’ve declined any more slowly than you would have if you hadn’t taken the drug at all?”

That’s an impossible question to answer, unless you’re following people with objective cognitive data over time and comparing them against a placebo group, which is how these drugs got here in the first place – we know they do that.

But to a family watching their loved ones go downhill, such reassurances aren’t what they want to hear.

Regrettably, that’s where things stand. While I want to strive for absolute success in these things, today it’s simply not possible. Maybe it never will be, though I hope it is.

But, for now, I agree that we need to reframe what we’re going to consider clinically meaningful. Sometimes you have to settle for a flight of stairs instead of an elevator, but still hope that you’ll get to the top. It just takes longer, and it’s better than not going anywhere at all.

Dr. Block has a solo neurology practice in Scottsdale, Ariz.

