Premature return to play after concussion has decreased


Rates of premature return to play (RTP) among student athletes following a sport-related concussion (SRC) have dropped substantially since 2011, according to a recent chart review. Rates of premature return to learn (RTL) are essentially unchanged, however.

“Delay in recovery is the major reason why it’s important not to RTL or RTP prematurely,” said James Carson, MD, associate professor of family and community medicine, University of Toronto.

“That delay in recovery only sets students further back in terms of the stress they get from being delayed with their schoolwork – they could lose their year in school, lose all their social contacts. So, there are a number of psychosocial issues that come into play if recovery is delayed, and that is what premature RTL and premature RTP will do – they delay the student’s recovery,” he emphasized.

The study was published in Canadian Family Physician.

Differences by sex

The study involved 241 students who had 258 distinct cases of SRC. The researchers defined premature RTP and RTL as chart records documenting the relapse, recurrence, or worsening of concussion symptoms that accompanied the patient’s RTP or RTL. Between 2011 and 2016, 26.7% of students had evidence of premature RTP, while 42.6% of them had evidence of premature RTL, the authors noted.

Compared with findings from an earlier survey of data from 2006 to 2011, the incidence of premature RTP dropped by 38.6% (P = .0003). In contrast, symptoms associated with premature RTL dropped by only 4.7% from the previous survey. This change was not statistically significant.

There was also a significant difference between males and females in the proportion of SRC cases with relapse of symptoms. Relapse occurred in 43.4% of female athletes with SRC versus 29.7% of male athletes with SRC (P = .023).

Female athletes also had significantly longer times before being cleared for RTP. The mean time was 74.5 days for females, compared with a mean of 42.3 days for male athletes (P < .001). “The median time to RTP clearance was nearly double [for female athletes] at 49 days versus 25 days [for male athletes],” wrote the authors.

The rate of premature RTL was highest among secondary school students (48.8%), compared with 28% among elementary students and 42% among postsecondary students.

More concussions coming?

Before the first consensus conference, organized by the Concussion in Sport Group in 2001, management of concussion was based on rating and grading scales that had no medical evidence to support them, said Dr. Carson. After the consensus conference, it was recommended that physicians manage each concussion individually and, when it came to RTP, recommendations were based upon symptom resolution.

In contrast, there was nothing in the literature regarding how student athletes who sustain a concussion should RTL. Some schools made generous accommodations, and others none. This situation changed around 2011, when experts began publishing data on how best to accommodate student athletes with concussion as a temporary disability for which schools need to introduce accommodations to help them recover.

“Recommendations for RTP essentially had a 12-year head start,” Dr. Carson emphasized, “and RTL had a much slower start.” Unfortunately, Dr. Carson foresees more athletes sustaining concussions as pandemic restrictions ease over the next few months. “As athletes RTP after the pandemic, they just will not be in game shape,” he said.

“In other words, athletes may not have the neuromuscular control to avoid these injuries as easily,” he added. Worse, athletes may not realize they are not quite ready to return to the expected level of participation so quickly. “I believe this scenario will lead to more concussions that will be difficult to manage in the context of an already strained health care system,” said Dr. Carson.

A limitation of the study was that it was difficult to assess whether all patients followed medical advice consistently.

“Very positive shifts”

Commenting on the findings, Nick Reed, PhD, Canada research chair in pediatric concussion and associate professor of occupational science and occupational therapy, University of Toronto, said that sports medicine physicians are seeing “very positive shifts” in concussion awareness and related behaviors such as providing education, support, and accommodations to students within the school environment. “More and more teachers are seeking education to learn what a concussion is and what to do to best support their students with concussion,” he said. Dr. Reed was not involved in the current study.

Indeed, this increasing awareness led to the development of a concussion education tool for teachers – SCHOOLFirst – although Dr. Reed did acknowledge that not all teachers have either the knowledge or the resources they need to optimally support their students with concussion. In the meantime, to reduce the risk of injury, Dr. Reed stressed that it is important for students to wear equipment appropriate for the game being played and to play by the rules.

“It is key to play sports in a way that is fair and respectful and not [engage] in behaviors with the intent of injuring an opponent,” he stressed. It is also important for athletes themselves to know the signs and symptoms of concussion and, if they think they have a concussion, to immediately stop playing, report how they are feeling to a coach, teacher, or parent, and to seek medical assessment to determine if they have a concussion or not.

“The key here is to focus on what the athlete can do after a concussion rather than what they can’t do,” Dr. Reed said. After even a few days of complete rest, students with a concussion can gradually introduce low levels of physical and cognitive activity that won’t make their symptoms worse. This activity can include going back to school with temporary accommodations in place, such as shorter school days and increased rest breaks. “When returning to school and to sport after a concussion, it is important to follow a stepwise and gradual return to activities so that you aren’t doing too much too fast,” Dr. Reed emphasized.

The study was conducted without external funding. Dr. Carson and Dr. Reed reported no conflicts of interest. 

A version of this article first appeared on Medscape.com.

Impaired vision an overlooked dementia risk factor


Impaired vision in older adults is an underrecognized and modifiable dementia risk factor, new research suggests.

Investigators analyzed estimated population attributable fractions (PAFs) associated with dementia in more than 16,000 older adults. A PAF represents the number of dementia cases that could be prevented if a given risk factor were eliminated.

Results showed the PAF of vision impairment was 1.8%, suggesting that healthy vision had the potential to prevent more than 100,000 cases of dementia in the United States.

“Vision impairment and blindness disproportionately impact older adults, yet vision impairment is often preventable or even correctable,” study investigator Joshua Ehrlich, MD, assistant professor of ophthalmology and visual sciences, University of Michigan, Ann Arbor, said in an interview.

Poor vision affects not only how individuals see the world, but also their systemic health and well-being, Dr. Ehrlich said.

“Accordingly, ensuring that older adults receive appropriate eye care is vital to promoting health, independence, and optimal aging,” he added.

The findings were published online in JAMA Neurology.

A surprising omission

There is an “urgent need to identify modifiable risk factors for dementia that can be targeted with interventions to slow cognitive decline and prevent dementia,” the investigators wrote.

In 2020, the Lancet Commission report on dementia prevention, intervention, and care proposed a life-course model of 12 potentially modifiable dementia risk factors: lower educational level, hearing loss, traumatic brain injury, hypertension, excessive alcohol consumption, obesity, smoking, depression, social isolation, physical inactivity, diabetes, and air pollution.

Together, these factors are associated with about 40% of dementia cases worldwide, the report notes.

Vision impairment was not included in this model, “despite considerable evidence that it is associated with an elevated risk of incident dementia and that it may operate through the same pathways as hearing loss,” the current researchers wrote.

“We have known for some time that vision impairment is a risk factor for dementia [and] we also know that a very large fraction of vision impairment, possibly in excess of 80%, is avoidable or has simply yet to be addressed,” Dr. Ehrlich said.

He and his colleagues found it “surprising that vision impairment had been ignored in key models of modifiable dementia risk factors that are used to shape health policy and resource allocation.” They set out to demonstrate that, “in fact, vision impairment is just as influential as a number of other long accepted modifiable dementia risk factors.”

The investigators assessed data from the Health and Retirement Study (HRS), a panel study that surveys more than 20,000 U.S. adults aged 50 years or older every 2 years.

The investigators applied the same methods used by the Lancet Commission to the HRS dataset and added vision impairment to the Lancet life-course model. Air pollution was excluded in their model “because those data were not readily available in the HRS,” the researchers wrote.

They noted the PAF is “based on the population prevalence and relative risk of dementia for each risk factor” and is “weighted, based on a principal components analysis, to account for communality (clustering of risk factors).”
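The intuition behind a PAF can be seen in the standard (Levin) attributable-fraction formula; the study's weighted, communality-adjusted variant is more involved. A minimal sketch, using hypothetical prevalence and relative-risk values for illustration only, not the study's actual inputs:

```python
def paf(prevalence: float, relative_risk: float) -> float:
    """Levin's population attributable fraction: the share of cases
    that would be removed if the exposure were eliminated."""
    excess = prevalence * (relative_risk - 1.0)
    return excess / (1.0 + excess)

# Hypothetical inputs: if 8% of older adults had vision impairment and it
# carried a relative risk of 1.25 for dementia, the unweighted PAF would
# be roughly 2% of dementia cases.
print(round(paf(0.08, 1.25), 3))
```

Note that a PAF is a counterfactual population-level quantity: it depends on both how common the risk factor is and how strongly it is associated with the outcome, which is why a modest relative risk can still translate into many preventable cases.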

A missed prevention opportunity

The sample included 16,690 participants (54% were women, 51.5% were at least age 65, 80.2% were White, 10.6% were Black, and 9.2% were of other races).

In total, the 12 potentially modifiable risk factors used in the researchers’ model were associated with an estimated 62.4% of dementia cases in the United States, with hypertension as the most prevalent risk factor with the highest weighted PAF.

A new focus for prevention

Commenting for this article, Suzann Pershing, MD, associate professor of ophthalmology, Stanford (Calif.) University, called the study “particularly important because, despite growing recognition of its importance in relation to cognition, visual impairment is often an underrecognized risk factor.”

The current research “builds on increasingly robust medical literature linking visual impairment and dementia, applying analogous methods to those used for the life course model recently presented by the Lancet Commission to evaluate potentially modifiable dementia risk factors,” said Dr. Pershing, who was not involved with the study.

The investigators “make a compelling argument for inclusion of visual impairment as one of the potentially modifiable risk factors; practicing clinicians and health care systems may consider screening and targeted therapies to address visual impairment, with a goal of population health and contributing to a reduction in future dementia disease burden,” she added.

In an accompanying editorial, Jennifer Deal, PhD, department of epidemiology and Cochlear Center for Hearing and Public Health, Baltimore, and Julio Rojas, MD, PhD, Memory and Aging Center, department of neurology, Weill Institute for Neurosciences, University of California, San Francisco, call the findings “an important reminder that dementia is a social problem in which potentially treatable risk factors, including visual impairment, are highly prevalent in disadvantaged populations.”

The editorialists noted that 90% of cases of vision impairment are “preventable or have yet to be treated.” The two “highly cost-effective interventions” of eyeglasses and/or cataract surgery “remain underused both in the U.S. and globally, especially in disadvantaged communities,” they wrote.

They added that more research is needed to “test the effectiveness of interventions to preserve cognitive health by promoting healthy vision.”

The study was supported by grants from the National Institute on Aging, the National Institutes of Health, and Research to Prevent Blindness. The investigators reported no relevant financial relationships. Dr. Deal reported having received grants from the National Institute on Aging. Dr. Rojas reported serving as site principal investigator on clinical trials for Eli Lilly and Eisai and receiving grants from the National Institute on Aging. Dr. Pershing is a consultant for Acumen and Verana Health (as DigiSight Technologies).

A version of this article first appeared on Medscape.com.

Issue
Neurology Reviews - 30(6)

 

Impaired vision in older adults is an underrecognized and modifiable dementia risk factor, new research suggests.

Investigators analyzed estimated population attributable fractions (PAFs) associated with dementia in more than 16,000 older adults. A PAF represents the number of dementia cases that could be prevented if a given risk factor were eliminated.

Results showed the PAF of vision impairment was 1.8%, suggesting that healthy vision had the potential to prevent more than 100,000 cases of dementia in the United States.

“Vision impairment and blindness disproportionately impact older adults, yet vision impairment is often preventable or even correctable,” study investigator Joshua Ehrlich MD, assistant professor of ophthalmology and visual sciences, University of Michigan, Ann Arbor, said in an interview.

Poor vision affects not only how individuals see the world, but also their systemic health and well-being, Dr. Ehrlich said.

“Accordingly, ensuring that older adults receive appropriate eye care is vital to promoting health, independence, and optimal aging,” he added.

The findings were published online in JAMA Neurology.
 

A surprising omission

There is an “urgent need to identify modifiable risk factors for dementia that can be targeted with interventions to slow cognitive decline and prevent dementia,” the investigators wrote.

In 2020, the Lancet Commission report on dementia prevention, intervention, and care proposed a life-course model of 12 potentially modifiable dementia risk factors. This included lower educational level, hearing loss, traumatic brain injury, hypertension, excessive alcohol consumption, obesity, smoking, depression, social isolation, physical inactivity, diabetes, and air pollution.

Together, these factors are associated with about 40% of dementia cases worldwide, the report notes.

Vision impairment was not included in this model, “despite considerable evidence that it is associated with an elevated risk of incident dementia and that it may operate through the same pathways as hearing loss,” the current researchers wrote.

“We have known for some time that vision impairment is a risk factor for dementia [and] we also know that a very large fraction of vision impairment, possibly in excess of 80%, is avoidable or has simply yet to be addressed,” Dr. Ehrlich said.

He and his colleagues found it “surprising that vision impairment had been ignored in key models of modifiable dementia risk factors that are used to shape health policy and resource allocation.” They set out to demonstrate that, “in fact, vision impairment is just as influential as a number of other long accepted modifiable dementia risk factors.”

The investigators assessed data from the Health and Retirement Study (HRS), a panel study that surveys more than 20,000 U.S. adults aged 50 years or older every 2 years.

The investigators applied the same methods used by the Lancet Commission to the HRS dataset and added vision impairment to the Lancet life-course model. Air pollution was excluded in their model “because those data were not readily available in the HRS,” the researchers wrote.

Impaired vision in older adults is an underrecognized and modifiable dementia risk factor, new research suggests.

Investigators analyzed estimated population attributable fractions (PAFs) associated with dementia in more than 16,000 older adults. A PAF represents the number of dementia cases that could be prevented if a given risk factor were eliminated.

Results showed the PAF of vision impairment was 1.8%, suggesting that healthy vision had the potential to prevent more than 100,000 cases of dementia in the United States.

“Vision impairment and blindness disproportionately impact older adults, yet vision impairment is often preventable or even correctable,” study investigator Joshua Ehrlich, MD, assistant professor of ophthalmology and visual sciences, University of Michigan, Ann Arbor, said in an interview.

Poor vision affects not only how individuals see the world, but also their systemic health and well-being, Dr. Ehrlich said.

“Accordingly, ensuring that older adults receive appropriate eye care is vital to promoting health, independence, and optimal aging,” he added.

The findings were published online in JAMA Neurology.

A surprising omission

There is an “urgent need to identify modifiable risk factors for dementia that can be targeted with interventions to slow cognitive decline and prevent dementia,” the investigators wrote.

In 2020, the Lancet Commission report on dementia prevention, intervention, and care proposed a life-course model of 12 potentially modifiable dementia risk factors. This included lower educational level, hearing loss, traumatic brain injury, hypertension, excessive alcohol consumption, obesity, smoking, depression, social isolation, physical inactivity, diabetes, and air pollution.

Together, these factors are associated with about 40% of dementia cases worldwide, the report notes.

Vision impairment was not included in this model, “despite considerable evidence that it is associated with an elevated risk of incident dementia and that it may operate through the same pathways as hearing loss,” the current researchers wrote.

“We have known for some time that vision impairment is a risk factor for dementia [and] we also know that a very large fraction of vision impairment, possibly in excess of 80%, is avoidable or has simply yet to be addressed,” Dr. Ehrlich said.

He and his colleagues found it “surprising that vision impairment had been ignored in key models of modifiable dementia risk factors that are used to shape health policy and resource allocation.” They set out to demonstrate that, “in fact, vision impairment is just as influential as a number of other long accepted modifiable dementia risk factors.”

The investigators assessed data from the Health and Retirement Study (HRS), a panel study that surveys more than 20,000 U.S. adults aged 50 years or older every 2 years.

The investigators applied the same methods used by the Lancet Commission to the HRS dataset and added vision impairment to the Lancet life-course model. Air pollution was excluded in their model “because those data were not readily available in the HRS,” the researchers wrote.

They noted the PAF is “based on the population prevalence and relative risk of dementia for each risk factor” and is “weighted, based on a principal components analysis, to account for communality (clustering of risk factors).”

A missed prevention opportunity

The sample included 16,690 participants (54% were women, 51.5% were at least age 65, 80.2% were White, 10.6% were Black, and 9.2% were of other races).

In total, the 12 potentially modifiable risk factors used in the researchers’ model were associated with an estimated 62.4% of dementia cases in the United States; hypertension was the most prevalent risk factor and carried the highest weighted PAF.

A new focus for prevention

Commenting for this article, Suzann Pershing, MD, associate professor of ophthalmology, Stanford (Calif.) University, called the study “particularly important because, despite growing recognition of its importance in relation to cognition, visual impairment is often an underrecognized risk factor.”

The current research “builds on increasingly robust medical literature linking visual impairment and dementia, applying analogous methods to those used for the life course model recently presented by the Lancet Commission to evaluate potentially modifiable dementia risk factors,” said Dr. Pershing, who was not involved with the study.

The investigators “make a compelling argument for inclusion of visual impairment as one of the potentially modifiable risk factors; practicing clinicians and health care systems may consider screening and targeted therapies to address visual impairment, with a goal of population health and contributing to a reduction in future dementia disease burden,” she added.

In an accompanying editorial, Jennifer Deal, PhD, department of epidemiology and Cochlear Center for Hearing and Public Health, Baltimore, and Julio Rojas, MD, PhD, Memory and Aging Center, department of neurology, Weill Institute for Neurosciences, University of California, San Francisco, call the findings “an important reminder that dementia is a social problem in which potentially treatable risk factors, including visual impairment, are highly prevalent in disadvantaged populations.”

The editorialists noted that 90% of cases of vision impairment are “preventable or have yet to be treated.” The two “highly cost-effective interventions” of eyeglasses and/or cataract surgery “remain underused both in the U.S. and globally, especially in disadvantaged communities,” they wrote.

They added that more research is needed to “test the effectiveness of interventions to preserve cognitive health by promoting healthy vision.”

The study was supported by grants from the National Institute on Aging, the National Institutes of Health, and Research to Prevent Blindness. The investigators reported no relevant financial relationships. Dr. Deal reported having received grants from the National Institute on Aging. Dr. Rojas reported serving as site principal investigator on clinical trials for Eli Lilly and Eisai and receiving grants from the National Institute on Aging. Dr. Pershing is a consultant for Acumen and Verana Health (formerly DigiSight Technologies).

A version of this article first appeared on Medscape.com.

Issue
Neurology Reviews - 30(6)
Publications
Topics
Article Type
Sections
Citation Override
Publish date: April 29, 2022
Disallow All Ads
Content Gating
No Gating (article Unlocked/Free)
Alternative CME
Disqus Comments
Default
Use ProPublica
Hide sidebar & use full width
render the right sidebar.
Conference Recap Checkbox
Not Conference Recap
Clinical Edge
Display the Slideshow in this Article
Medscape Article
Display survey writer
Reuters content
Disable Inline Native ads
WebMD Article

Three symptoms suggest higher risk for self-injury in cancer

Article Type
Changed
Thu, 12/15/2022 - 14:32

Moderate to severe anxiety, depression, and shortness of breath indicate increased risk for nonfatal self-injury (NFSI) among patients newly diagnosed with cancer, according to a Canadian study.

In a population-based, case-control study, each of these symptoms was associated with an increase of at least 60% in the risk for NFSI in the following 180 days, the investigators report.

“Clinicians should know that self-injury is a real problem after a cancer diagnosis,” lead investigator Julie Hallet, MD, an associate scientist at Sunnybrook Health Sciences Centre in Toronto, told this news organization.

Self-injury “does not necessarily represent an attempted suicide,” she added. “While our data do not allow us to know what the intent was, we know from other work that the repercussions of distress in patients with cancer are much broader than suicide. Self-injury can be a means to cope with psychological difficulties for some patients, without intent for suicide.”

The study was published online in JAMA Oncology.

Nine common symptoms

The study included adults who were diagnosed with cancer between Jan. 1, 2007, and March 31, 2019, and had completed the Edmonton Symptom Assessment System (ESAS) evaluation within 36 months of their index cancer diagnosis. ESAS evaluates nine common cancer-associated symptoms, including pain, tiredness, nausea, depression, anxiety, drowsiness, appetite, well-being, and shortness of breath, on a patient-reported scale of 0 (absence of symptom) to 10 (worst possible symptom).

The analysis included 406 patients who had visited an emergency department for an NFSI within 180 days of their ESAS evaluation, as well as 1,624 matched control patients with cancer who did not have an NFSI. Case patients and control patients were matched according to age at cancer diagnosis, sex, prior self-injury within 5 years of being diagnosed with cancer, and cancer type. Nonmatched covariates included psychiatric illness and therapy received before NFSI, comorbidity burden, material deprivation, and cancer stage.

Toward tailored intervention

A higher proportion of case patients than control patients reported moderate to severe scores for all nine ESAS symptoms. In an adjusted analysis, moderate to severe anxiety (odds ratio, 1.61), depression (OR, 1.66), and shortness of breath (OR, 1.65) were independently associated with higher odds of subsequent NFSI. Each 10-point increase in total ESAS score also was associated with increased risk (OR, 1.51).

“These findings are important to enhance the use of screening ESAS scores to better support patients,” say the authors. “Scores from ESAS assessments can be used to identify patients at higher risk of NFSI, indicating higher level of distress, and help direct tailored assessment and intervention.”

In prior work, Dr. Hallet’s group showed that NFSI occurs in 3 of every 1,000 patients with cancer. NFSI is more frequent among younger patients and those with a history of prior mental illness. “Identifying patients at risk in clinical practice requires you to inquire about a patient’s prior history, identify high symptom scores and ask about them, and trigger intervention pathways when risk is identified,” said Dr. Hallet.

“For example, a young patient with head and neck cancer and a prior history of mental illness who reports high scores for anxiety and drowsiness would be at high risk of self-injury,” she added. Such a patient should be referred to psycho-oncology, psychiatry, or social work. “To facilitate this, we are working on prognostic scores that can be integrated in clinical practice, such as an electronic medical record, to flag patients at risk,” said Dr. Hallet. “Future work will also need to identify the optimal care pathways for at-risk patients.”

Self-injury vs. suicidality

Commenting on the study for this news organization, Madeline Li, MD, PhD, a psychiatrist and clinician-scientist at Toronto’s Princess Margaret Cancer Centre, said that the findings are “underwhelming” because they tell us what is already known – that “NFSI is associated with distress, and cancer is a stressor.” It would have been more interesting to ask how to distinguish patients at risk for suicide from those at risk for self-harm without suicide, she added.

“The way these authors formulated NFSI included both self-harm intent and suicidal intent,” she explained. The researchers compared patients who were at risk for these two types of events with patients without NFSI. “When we see self-harm without suicidal intent in the emergency room, it’s mostly people making cries for help,” said Dr. Li. “These are people who cut their wrists or take small overdoses on purpose without the intent to die. It would have been more interesting to see if there are different risk factors for people who are just going to self-harm vs. those who are actually going to attempt suicide.”

The study’s identification of risk factors for NFSI is important because “it does tell us that when there’s anxiety, depression, and shortness of breath, we should pay attention to these patients and do something about it,” said Dr. Li. Still, research in cancer psychiatry needs to shift its focus from identifying and addressing existing risk factors to preventing them from developing, she added.

“We need to move earlier and provide emotional and mental health support to cancer patients to prevent them from becoming suicidal, rather than intervening when somebody already is,” Dr. Li concluded.

The study was funded by the Hanna Research Award from the division of surgical oncology at the Odette Cancer Centre–Sunnybrook Health Sciences Centre and by a Sunnybrook Health Sciences Centre Alternate Funding Plan Innovation grant. It was also supported by ICES, which is funded by an annual grant from the Ontario Ministry of Health and Long-Term Care. Dr. Hallet has received personal fees from Ipsen Biopharmaceuticals Canada and AAA outside the submitted work. Dr. Li reported no relevant financial relationships.

A version of this article first appeared on Medscape.com.

Publications
Topics
Sections


Publications
Topics
Article Type
Sections
Article Source

FROM JAMA ONCOLOGY

Disallow All Ads
Content Gating
No Gating (article Unlocked/Free)
Alternative CME
Disqus Comments
Default
Use ProPublica
Hide sidebar & use full width
render the right sidebar.
Conference Recap Checkbox
Not Conference Recap
Clinical Edge
Display the Slideshow in this Article
Medscape Article
Display survey writer
Reuters content
Disable Inline Native ads
WebMD Article

Which solid organ transplant recipients face the highest risk of skin cancer?

Article Type
Changed
Thu, 12/15/2022 - 14:33

According to the best available data, solid organ transplant recipients (SOTRs) at highest risk for developing skin cancer are thoracic organ recipients, those aged 50 or older at the time of the transplant, and males.

White patients who meet these criteria should be screened within 2 years after transplant, while Black patients should be screened within 5 years after transplant, Ally-Khan Somani, MD, PhD, said at the annual meeting of the American Academy of Dermatology.


Dr. Somani, director of dermatologic surgery and the division of cutaneous oncology at Indiana University, Indianapolis, based his remarks on consensus screening guidelines assembled from three rounds of Delphi method surveys with 47 dermatologists and 37 transplant physicians, with the goal of establishing skin cancer screening recommendations for SOTRs. Among the dermatologists surveyed, 45% were Mohs surgeons and 55% were general dermatologists.

The panel recommended that the transplant team should perform risk assessment for SOTRs to risk stratify patients for skin cancer screening (high risk vs. low risk). They also proposed that dermatologists perform skin cancer screening by full-body skin examinations, and that SOTRs with a history of skin cancer should continue with routine skin cancer surveillance as recommended by their dermatologists.

Those at low risk for skin cancer include abdominal organ recipients, those younger than 50 at the time of transplant, and female patients. The guidelines recommend that White, Asian, and Hispanic patients who meet those criteria should be screened within 5 years after transplant, while no consensus was reached for Black patients who meet those criteria.



Based on posttransplant skin cancer incidence rates, risk is increased among males, White patients, thoracic organ recipients, and those age 50 or older, Dr. Somani said. “At our institution, we make sure there’s a good connection between our transplant teams and dermatologists. We recommend rapid referral for suspicious lesions and we educate patients and screen them within 1 year of transplant, or sooner for high-risk patients. Surveillance is increased to every 3 or 4 months for patients with a history of multiple or high-risk cancers or sooner, followed by routine surveillance as recommended by the patient’s dermatologist.”

To risk stratify patients on the development of their first skin cancer post transplantation, researchers developed the Skin and Ultraviolet Neoplasia Transplant Risk Assessment Calculator (SUNTRAC), a prediction tool with a freely available app. Data for the tool were drawn from the Transplant Skin Cancer Network study, a 5-year analysis of 6,340 adult recipients of a first solid organ transplant at 26 transplant centers in the United States. It generates a risk score for SOTRs (low, medium, high, or very high), which informs transplant care providers of a patient’s risk of skin cancer.

Dr. Somani disclosed that he has received grants and funding from Castle Biosciences. He is an adviser to Cook Biotech and a consultant to Sanara MedTech.

Meeting/Event
Publications
Topics
Sections


According to the best available data, solid organ transplant recipients (SOTRs) at highest risk for developing skin cancer are thoracic organ recipients, those aged 50 or older at the time of the transplant, and males.

White patients who meet these criteria should be screened within 2 years after transplant, while Black patients should be screened within 5 years after transplant, Ally-Khan Somani, MD, PhD, said at the annual meeting of the American Academy of Dermatology.


Dr. Somani, director of dermatologic surgery and the division of cutaneous oncology at Indiana University, Indianapolis, based his remarks on consensus screening guidelines assembled from three rounds of Delphi method surveys with 47 dermatologists and 37 transplant physicians, with the goal of establishing skin cancer screening recommendations for SOTRs. Among the dermatologists surveyed, 45% were Mohs surgeons and 55% were general dermatologists.

The panel recommended that the transplant team should perform risk assessment for SOTRs to risk stratify patients for skin cancer screening (high risk vs. low risk). They also proposed that dermatologists perform skin cancer screening by full-body skin examinations, and that SOTRs with a history of skin cancer should continue with routine skin cancer surveillance as recommended by their dermatologists.

Those at low risk for skin cancer include abdominal organ recipients, patients younger than 50 at the time of transplant, and women. The guidelines recommend that White, Asian, and Hispanic patients who meet those criteria be screened within 5 years after transplant; no consensus was reached for Black patients who meet those criteria.



Based on posttransplant skin cancer incidence rates, risk is increased among male patients, White patients, thoracic organ recipients, and those aged 50 or older, Dr. Somani said. “At our institution, we make sure there’s a good connection between our transplant teams and dermatologists. We recommend rapid referral for suspicious lesions and we educate patients and screen them within 1 year of transplant, or sooner for high-risk patients. Surveillance is increased to every 3 or 4 months for patients with a history of multiple or high-risk cancers or sooner, followed by routine surveillance as recommended by the patient’s dermatologist.”

To stratify patients by risk of developing a first skin cancer after transplantation, researchers developed the Skin and Ultraviolet Neoplasia Transplant Risk Assessment Calculator (SUNTRAC), a prediction tool with a freely available app. Data for the tool were drawn from the Transplant Skin Cancer Network study, a 5-year analysis of 6,340 adult recipients of a first solid organ transplant at 26 transplant centers in the United States. The calculator generates a risk score for SOTRs (low, medium, high, or very high), which informs transplant care providers of a patient’s risk of skin cancer.

Dr. Somani disclosed that he has received grants and funding from Castle Biosciences. He is an adviser to Cook Biotech and a consultant to Sanara MedTech.

Article Source

AT AAD 22


Do personality traits predict cognitive decline?

Article Type
Changed
Thu, 12/15/2022 - 15:38

Extraverts and individuals who are disciplined are less likely to experience cognitive decline later in life, whereas those high in neuroticism have an increased risk for cognitive dysfunction, new research shows.

Investigators analyzed data from almost 2,000 individuals enrolled in the Rush Memory and Aging Project (MAP) – a longitudinal study of older adults living in the greater Chicago metropolitan region and northeastern Illinois – with recruitment that began in 1997 and continues through today. Participants received a personality assessment as well as annual assessments of their cognitive abilities.

Those with high scores on measures of conscientiousness were significantly less likely to progress from normal cognition to mild cognitive impairment (MCI) during the study. In fact, scoring an extra 1 standard deviation on the conscientiousness scale was associated with a 22% lower risk of transitioning from no cognitive impairment (NCI) to MCI. On the other hand, scoring an additional 1 SD on a neuroticism scale was associated with a 12% increased risk of transitioning to MCI.

Participants who scored high on extraversion, as well as those who scored high on conscientiousness or low on neuroticism, tended to maintain normal cognitive functioning longer than other participants.

“Personality traits reflect relatively enduring patterns of thinking and behaving, which may cumulatively affect engagement in healthy and unhealthy behaviors and thought patterns across the lifespan,” lead author Tomiko Yoneda, PhD, a postdoctoral researcher in the department of medical social sciences, Northwestern University, Chicago, said in an interview.

“The accumulation of lifelong experiences may then contribute to susceptibility of particular diseases or disorders, such as mild cognitive impairment, or contribute to individual differences in the ability to withstand age-related neurological changes,” she added.

The study was published online in the Journal of Personality and Social Psychology.
 

Competing risk factors

Personality traits “reflect an individual’s persistent patterns of thinking, feeling, and behaving,” Dr. Yoneda said.

“For example, conscientiousness is characterized by competence, dutifulness, and self-discipline, while neuroticism is characterized by anxiety, depressive symptoms, and emotional instability. Likewise, individuals high in extraversion tend to be enthusiastic, gregarious, talkative, and assertive,” she added.

Previous research “suggests that low conscientiousness and high neuroticism are associated with an increased risk of cognitive impairment,” she continued. However, “there is also an increased risk of death in older adulthood – in other words, these outcomes are ‘competing risk factors.’”

Dr. Yoneda said her team wanted to “examine the impact of personality traits on the simultaneous risk of transitioning to mild cognitive impairment, dementia, and death.”  

For the study, the researchers analyzed data from 1,954 participants in MAP (mean age at baseline 80 years, 73.7% female, 86.8% White), who received a personality assessment and annual assessments of their cognitive abilities.

To assess personality traits – in particular, conscientiousness, neuroticism, and extraversion – the researchers used the NEO Five Factor Inventory (NEO-FFI). They also used multistate survival modeling to examine the potential association between these traits and transitions from one cognitive status category to another (NCI, MCI, and dementia) and to death.
 

Cognitive healthspan

By the end of the study, over half of the sample (54%) had died.

Most transitions showed “relative stability in cognitive status across measurement occasions.”

  • NCI to NCI (n = 7,368)
  • MCI to MCI (n = 1,244)
  • Dementia to dementia (n = 876)

There were 725 “backward transitions” from MCI to NCI, “which may reflect improvement or within-person variability in cognitive functioning, or learning effects,” the authors note.

There were only 114 “backward transitions” from dementia to MCI and only 12 from dementia to NCI, “suggesting that improvement in cognitive status was relatively rare, particularly once an individual progresses to dementia.”

After adjusting for demographics, depressive symptoms, and apolipoprotein E (APOE) ε4 allele status, the researchers found that personality traits were the most important factors in the transition from NCI to MCI.

Higher conscientiousness was associated with a decreased risk of transitioning from NCI to MCI (hazard ratio, 0.78; 95% confidence interval, 0.72-0.85). Conversely, higher neuroticism was associated with an increased risk of transitioning from NCI to MCI (HR, 1.12; 95% CI, 1.04-1.21) and a significantly decreased likelihood of transition back from MCI to NCI (HR, 0.90; 95% CI, 0.81-1.00).

Scoring ~6 more points on a conscientiousness scale ranging from 0 to 48 (that is, 1 SD on the scale) was significantly associated with ~22% lower risk of transitioning forward from NCI to MCI, while scoring ~7 more points on a neuroticism scale (1 SD) was significantly associated with ~12% higher risk of transitioning from NCI to MCI.
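The per-1-SD effect sizes above are simply the reported hazard ratios re-expressed as percent changes in risk. A minimal sketch of that arithmetic, using the hazard ratios reported in the study (the function name is mine, for illustration only):

```python
def pct_change_from_hr(hr: float) -> float:
    """Re-express a hazard ratio as a percent change in risk."""
    return round((hr - 1.0) * 100.0, 1)

# Per-1-SD hazard ratios for the NCI -> MCI transition reported in the study.
# 1 SD is ~6 points on the 0-48 conscientiousness scale, ~7 points on the
# neuroticism scale.
hr_conscientiousness = 0.78
hr_neuroticism = 1.12

print(pct_change_from_hr(hr_conscientiousness))  # -22.0, i.e., ~22% lower risk
print(pct_change_from_hr(hr_neuroticism))        # 12.0, i.e., ~12% higher risk
```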

Higher extraversion was associated with an increased likelihood of transitioning from MCI back to NCI (HR, 1.12; 95% CI, 1.03-1.22), and although extraversion was not associated with a longer total lifespan, participants who scored high on extraversion, as well as those who scored high on conscientiousness or low on neuroticism, maintained normal cognitive function longer than other participants.

“Our results suggest that high conscientiousness and low neuroticism may protect individuals against mild cognitive impairment,” said Dr. Yoneda.

Importantly, individuals who were either higher in conscientiousness, higher in extraversion, or lower in neuroticism had more years of “cognitive healthspan,” meaning more years without cognitive impairment, she added.

In addition, “individuals lower in neuroticism and higher in extraversion were more likely to recover after receiving an MCI diagnosis, suggesting that these traits may be protective even after an individual starts to progress to dementia,” she said.

The authors note that the study focused on only three of the Big Five personality traits; the other two – openness to experience and agreeableness – may also be associated with cognitive aging processes and mortality.

Nevertheless, given the current results, alongside extensive research in the personality field, aiming to increase conscientiousness through persistent behavioral change is one potential strategy for promoting healthy cognitive aging, Dr. Yoneda said.
 

‘Invaluable window’

In a comment, Brent Roberts, PhD, professor of psychology, University of Illinois Urbana-Champaign, said the study provides an “invaluable window into how personality affects the process of decline and either accelerates it, as in the role of neuroticism, or decelerates it, as in the role of conscientiousness.”

“I think the most fascinating finding was the fact that extraversion was related to transitioning from MCI back to NCI. These types of transitions have simply not been part of prior research, and it provides utterly unique insights and opportunities for interventions that may actually help people recover from a decline,” said Dr. Roberts, who was not involved in the research.

Claire Sexton, DPhil, Alzheimer’s Association director of scientific programs and outreach, called the paper “novel” because it investigated the transitions between normal cognition and mild impairment and between mild impairment and dementia.

Dr. Sexton, who was not associated with this research, cautioned that the study is observational, “so it can illuminate associations or correlations, but not causes. As a result, we can’t say for sure what the mechanisms are behind these potential connections between personality and cognition, and more research is needed.”

The research was supported by the Alzheimer Society Research Program, Social Sciences and Humanities Research Council, and the National Institute on Aging of the National Institutes of Health. Dr. Yoneda and co-authors, Dr. Roberts, and Dr. Sexton have disclosed no relevant financial relationships.

A version of this article first appeared on Medscape.com.


Article Source

FROM THE JOURNAL OF PERSONALITY AND SOCIAL PSYCHOLOGY


Biomarker testing gains momentum in NSCLC

Article Type
Changed
Thu, 12/15/2022 - 14:33

Despite Spain’s lack of a national project or standard protocol for biomarker testing, more than half of patients diagnosed with stage 4 non–small cell lung cancer (NSCLC) are tested for biomarkers, according to a Spanish national registry study reported at the 2022 European Lung Cancer Congress.

“In recent years we’ve developed drugs that target biomarkers, so it’s important to identify those biomarkers to guide treatment and have an impact on the survival of our patients,” said lead author Virginia Calvo, MD, a medical oncologist with the Puerta de Hierro Majadahonda University Hospital, Madrid.

“If we don’t know our patients’ biomarkers, we can’t treat them with targeted therapies,” she added, noting that the overall survival of lung cancer patients has increased by 15% in the last 10 years, largely because of better therapies such as targeted drugs for advanced stage disease and immunotherapies.

To assess the status of biomarker testing in Spain, Dr. Calvo and colleagues analyzed data from the country’s Thoracic Tumor Registry on 9,239 patients diagnosed with metastatic NSCLC from 2016 to the present, 7,467 (81%) with nonsquamous tumors and 1,772 (19%) with squamous tumors.

They found that 85% of patients with nonsquamous NSCLC and about 53% of those with squamous cancers had undergone biomarker testing; 4,115 patients (44%) tested positive for EGFR, ALK, KRAS, BRAF, ROS1, or PD-L1.

Dr. Calvo attributes the widespread use of biomarker testing and its significant increase in the last 5 years to the growing knowledge and understanding of the disease.

“We are learning more about NSCLC, and I think in the next few years the number of biomarkers are going to grow,” she said.

The study’s findings also highlight the importance of establishing and maintaining cancer registries, Dr. Calvo said, noting that 182 hospitals across Spain and more than 550 experts participate in the Thoracic Tumors Registry, which includes data on patients from every Spanish territory.

“It’s important to collect information on real-life cancer care so that we know what our real situation is and take steps to improve it,” she said.

She anticipates that treatment for NSCLC patients will become increasingly complex in the future with the growing number of different biomarkers and the proportion of patients who test positive for them. “We may need to establish national strategies to implement next generation sequencing so that we can identify different biomarkers and improve the survival of our patients.”

In a press release, Rolf Stahel, MD, president of the European Thoracic Oncology Platform, said that it would be helpful to look at how frequently molecular testing led to patients receiving appropriate targeted treatment.

In the United States, the National Comprehensive Cancer Network recommends biomarker testing for eligible patients with newly diagnosed stage 4 NSCLC, and it can be considered for patients with squamous histology because 5%-10% of these tumors have targetable mutations. “This is because numerous lines of evidence show that patients with stage 4 NSCLC and a targetable mutation, typically have improved overall survival when treated with a targeted therapy,” wrote the authors of the NCCN recommendations.

“For newly diagnosed stage 4 NSCLC, there is always a tension between the need to start therapy versus waiting for molecular results. This is because if a recommended targeted option is identified, it is the optimal first-line therapy. Targeted therapy cannot be given to everyone. Different biomarkers predict response to different agents. This has been well illustrated and it makes testing critically important for patients with NSCLC,” Dara Aisner, MD, PhD, associate professor of pathology with the University of Colorado at Denver, Aurora, wrote in the NCCN guideline.

The study presented at ELCC was funded by a grant from the European Union’s Horizon 2020 Research and Innovation Program. Dr. Calvo has received fees from Roche, Bristol-Myers Squibb, MSD and AstraZeneca.

FROM ELCC 2022

Disallow All Ads
Content Gating
No Gating (article Unlocked/Free)
Alternative CME
Disqus Comments
Default
Use ProPublica
Hide sidebar & use full width
render the right sidebar.
Conference Recap Checkbox
Not Conference Recap
Clinical Edge
Display the Slideshow in this Article
Medscape Article
Display survey writer
Reuters content
Disable Inline Native ads
WebMD Article

Childhood abuse may increase risk of MS in women

Article Type
Changed
Thu, 12/15/2022 - 15:38

Emotional or sexual abuse in childhood may increase risk of multiple sclerosis (MS) in women, and risk may increase further with exposure to multiple kinds of abuse, according to the first prospective cohort study of its kind.

More research is needed to uncover underlying mechanisms of action, according to lead author Karine Eid, MD, a PhD candidate at Haukeland University Hospital, Bergen, Norway, and colleagues.

“Trauma and stressful life events have been associated with an increased risk of autoimmune disorders,” the investigators wrote in the Journal of Neurology, Neurosurgery & Psychiatry. “Whether adverse events in childhood can have an impact on MS susceptibility is not known.”

The present study recruited participants from the Norwegian Mother, Father and Child cohort, a population consisting of Norwegian women who were pregnant from 1999 to 2008. Of the 77,997 participating women, 14,477 reported emotional, sexual, and/or physical abuse in childhood, while the remaining 63,520 women reported no abuse. After a mean follow-up of 13 years, 300 women were diagnosed with MS, among whom 24% reported a history of childhood abuse, compared with 19% among women who did not develop MS.

To look for associations between childhood abuse and risk of MS, the investigators used a Cox model adjusted for confounders and mediators, including smoking, obesity, adult socioeconomic factors, and childhood social status. The model revealed that emotional abuse increased the risk of MS by 40% (hazard ratio [HR] 1.40; 95% confidence interval [CI], 1.03-1.90), and sexual abuse increased the risk of MS by 65% (HR 1.65; 95% CI, 1.13-2.39).

Although physical abuse alone did not significantly increase risk of MS (HR 1.31; 95% CI, 0.83-2.06), it did contribute to a dose-response relationship when women were exposed to more than one type of childhood abuse. Women exposed to two out of three abuse categories had a 66% increased risk of MS (HR 1.66; 95% CI, 1.04-2.67), whereas women exposed to all three types of abuse had the highest risk of MS, at 93% (HR 1.93; 95% CI, 1.02-3.67).
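
The hazard ratios above map directly onto the percent risk increases quoted in the text. A minimal stdlib-only sketch (the numbers are transcribed from the article, not a reanalysis, and the significance check is simply whether the 95% CI excludes 1.0):

```python
# Adjusted hazard ratios (HR, 95% CI lower, upper) for MS by childhood
# abuse exposure, as quoted in the article.
results = {
    "emotional abuse":        (1.40, 1.03, 1.90),
    "sexual abuse":           (1.65, 1.13, 2.39),
    "physical abuse":         (1.31, 0.83, 2.06),
    "two abuse categories":   (1.66, 1.04, 2.67),
    "three abuse categories": (1.93, 1.02, 3.67),
}

for exposure, (hr, lo, hi) in results.items():
    pct = (hr - 1) * 100      # e.g., HR 1.40 -> "40% increased risk"
    significant = lo > 1.0    # 95% CI entirely above 1.0
    print(f"{exposure}: +{pct:.0f}% (95% CI {lo:.2f}-{hi:.2f}), "
          f"significant={significant}")
```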

Dr. Eid and colleagues noted that their findings are supported by previous retrospective research, and discussed possible mechanisms of action.

“The increased risk of MS after exposure to childhood sexual and emotional abuse may have a biological explanation,” they wrote. “Childhood abuse can cause dysregulation of the hypothalamic-pituitary-adrenal axis, lead to oxidative stress, and induce a proinflammatory state decades into adulthood. Psychological stress has been shown to disrupt the blood-brain barrier and cause epigenetic changes that may increase the risk of neurodegenerative disorders, including MS.

“The underlying mechanisms behind this association should be investigated further,” they concluded.

Study findings should guide interventions

Commenting on the research, Ruth Ann Marrie, MD, PhD, professor of medicine and community health sciences and director of the multiple sclerosis clinic at Max Rady College of Medicine, Rady Faculty of Health Sciences, University of Manitoba, Winnipeg, said that the present study “has several strengths compared to prior studies – including that it is prospective and the sample size.”

Dr. Marrie, who was not involved in the study, advised clinicians in the field to take note of the findings, as patients with a history of abuse may need unique interventions.

“Providers need to recognize the higher prevalence of childhood maltreatment in people with MS,” Dr. Marrie said in an interview. “These findings dovetail with others that suggest that adverse childhood experiences are associated with increased mental health concerns and pain catastrophizing in people with MS. Affected individuals may benefit from additional psychological supports and trauma-informed care.”

Tiffany Joy Braley, MD, associate professor of neurology, and Carri Polick, RN, a PhD candidate at the school of nursing, University of Michigan, Ann Arbor, who published a case report last year highlighting the importance of evaluating stress exposure in MS, suggested that the findings should guide interventions at both the system and patient level.

“Although a cause-and-effect relationship cannot be established by the current study, these and related findings should be considered in the context of system level and policy interventions that address links between environment and health care disparities,” they said in a joint, written comment. “Given recent impetus to provide trauma-informed health care, these data could be particularly informative in neurological conditions which are associated with high mental health comorbidity. Traumatic stress screening practices could lead to referrals for appropriate support services and more personalized health care.”

While several mechanisms have been proposed to explain the link between traumatic stress and MS, more work is needed in this area, they added.

This knowledge gap was acknowledged by Dr. Marrie.

“Our understanding of the etiology of MS remains incomplete,” Dr. Marrie said. “We still need a better understanding of mechanisms by which adverse childhood experiences lead to MS, how they interact with other risk factors for MS (beyond smoking and obesity), and whether there are any interventions that can mitigate the risk of developing MS that is associated with adverse childhood experiences.”

The investigators disclosed relationships with Novartis, Biogen, Merck, and others. Dr. Marrie receives research support from the Canadian Institutes of Health Research, the National Multiple Sclerosis Society, MS Society of Canada, the Consortium of Multiple Sclerosis Centers, Crohn’s and Colitis Canada, Research Manitoba, and the Arthritis Society; she has no pharmaceutical support. Dr. Braley and Ms. Polick reported no conflicts of interest.

Issue
Neurology Reviews - 30(6)

FROM THE JOURNAL OF NEUROLOGY, NEUROSURGERY, & PSYCHIATRY

Publish date: April 20, 2022

Zanubrutinib shows worth against standard CLL drugs

Article Type
Changed
Thu, 01/12/2023 - 10:44

– A new treatment option may soon be available for patients with chronic lymphocytic leukemia (CLL)/small lymphocytic lymphoma (SLL).

Zanubrutinib (Brukinsa), an irreversible, next-generation Bruton tyrosine kinase (BTK) inhibitor, is designed to minimize the off-target cardiovascular toxicities, such as atrial fibrillation and hypertension, seen with the first-generation ibrutinib (Imbruvica).

Zanubrutinib is already approved for use in mantle cell and marginal zone lymphomas and Waldenström’s macroglobulinemia.

Now it has also shown efficacy in CLL. In two phase 3 clinical trials, zanubrutinib has shown improved outcomes and reduced toxicity when compared with more established treatments in patients with relapsed/refractory and untreated CLL and SLL.

However, experts question whether the drug will find its place in an increasingly crowded space for the management of CLL.

Data from two phase 3 trials

The new data from two phase 3 clinical trials were presented recently at the British Society for Haematology 62nd annual scientific meeting, held in Manchester, England.

The ALPINE trial compared zanubrutinib with ibrutinib in 415 patients with CLL/SLL and showed that the novel drug was associated with a significant improvement in overall response rate, at 78% versus 63%.

This first interim analysis also showed that there was an increase in progression-free survival (PFS) with zanubrutinib, and crucially, it was associated with a lower atrial fibrillation/flutter rate than ibrutinib.

“These data support that more selective BTK inhibition, with more complete and sustained BTK occupancy, results in improved efficacy and safety outcomes,” said lead author Peter Hillmen, MBChB, FRCP, PhD, St. James’s University Hospital, Leeds, England.

The SEQUOIA study compared zanubrutinib with bendamustine plus rituximab in patients with untreated CLL/SLL with a 17p deletion and showed that zanubrutinib improved PFS by 58%.

Zanubrutinib was also associated with improved overall response rates and was well tolerated.

The results therefore “support the potential utility of zanubrutinib in the frontline management of patients with previously untreated CLL/SLL,” said lead author Talha Munir, MBBS, also of St. James’s University Hospital.

Improvement over ibrutinib

Ibrutinib, the first BTK inhibitor, “truly revolutionized the way we treat CLL,” commented Renata Walewska, MRCP, PhD, consultant hematologist at the Royal Bournemouth (England) Hospital and chair of the UKCLL Forum.

“But it has got quite a lot of, especially cardiac, problems, with atrial fibrillation and hypertension,” she said in an interview. The problem is that it acts not only as an inhibitor of Bruton kinase, but also affects other kinases, she explained.

Zanubrutinib is “much cleaner,” continued Dr. Walewska, who was lead author of the recently published British Society of Haematology guideline for the treatment of CLL.

However, the drug “is not that groundbreaking,” she commented, as acalabrutinib (Calquence), another next-generation BTK inhibitor, is already available for use in the clinic.

“We’re really lucky in CLL,” Dr. Walewska said, “we’ve got so many new drugs available, and it’s getting quite crowded. Trying to find a place for zanubrutinib is going to be tricky.”

Lee Greenberger, PhD, chief scientific officer at the Leukemia & Lymphoma Society, commented that he “gives a lot of credit” to BeiGene, the company behind zanubrutinib, for “taking on these big studies.”

He said that, with the improvements in PFS and reduced atrial fibrillation with the drug, “there will be many clinicians paying attention to this and zanubrutinib could be preferred over conventional options.”

However, he agreed that it will have to compete with acalabrutinib, adding that, beyond BTK inhibitors, there are “a lot of options” for patients with CLL.

“That makes it very difficult for physicians to figure out what is the best type of therapy” to use in these patients, he added.

Dr. Greenberger told this news organization that further studies will need to demonstrate that zanubrutinib is associated with extended survival, which is “just not possible to show” at the moment with the current follow-up period.

He also noted that, in 10 years, ibrutinib will be off-patent, but zanubrutinib will not, at which point the “substantial” cost of the medication, which is a source of “hardship to patients,” will be increasingly relevant.
 

 

 

Study details

The phase 3 ALPINE study involved 415 adults with CLL/SLL refractory to one or more prior systemic therapies and measurable lymphadenopathy on imaging.  

They were randomized 1:1 to zanubrutinib or ibrutinib until disease progression or withdrawal from the study.

Most patients had Binet stage A/B or Ann Arbor stage I/II disease, and 7.3% of patients treated with zanubrutinib and 10.1% of those assigned to ibrutinib had received more than three prior lines of therapy.

Over 60% of patients were aged 65 years or older and around 70% were men, with no significant differences between treatment groups.

After a median follow-up of 15 months, the overall response rate was significantly higher with zanubrutinib than ibrutinib, at 78.3% versus 62.5% (P = .0006).

Subgroup analysis confirmed that the effect was seen regardless of age, sex, disease stage, number of prior lines of therapy, mutation status, or bulky disease.

Over a median follow-up of 14 months, the investigator-assessed 12-month PFS was 94.9% for zanubrutinib and 84.0% for ibrutinib (P = .0007). Overall survival at 12 months was 97% versus 92.7%, but the difference was not significant (P = .1081).

Patients treated with zanubrutinib experienced more grade 3 or higher adverse events than those given ibrutinib, at 55.9% versus 51.2%, although they had fewer adverse events leading to treatment discontinuation, at 7.8% versus 13.0%.

More importantly, there were fewer cardiac disorders of any grade with zanubrutinib versus ibrutinib, and any-grade atrial fibrillation was significantly less common, at 2.5% versus 10.1% (P = .0014).

Rates of hypertension and hemorrhage were similar between the two treatments, while rates of neutropenia were higher with zanubrutinib versus ibrutinib, at 28.4% versus 21.7%.

The phase 3 SEQUOIA study looked at an earlier stage of disease and included patients with previously untreated CLL/SLL (without 17p deletion) who were unsuitable for treatment with fludarabine, cyclophosphamide, and rituximab.

This trial involved 479 patients randomized to zanubrutinib or bendamustine (days 1 and 2) plus rituximab for six cycles of 28 days each (B+R).

The median age of patients was 70 years, and approximately 80% were at least 65 years old. Just over 60% were men and most (over 70%) were from Europe.

After a median of 26.2 months, independent review committee–assessed PFS was significantly longer with zanubrutinib versus B+R (hazard ratio, 0.42; P < .0001), with an estimated 24-month PFS of 85.5% versus 69.5%.

These results held whether patients were stratified by age, Binet stage, bulky disease, or 11q deletion status, and for patients with an unmutated, but not mutated, immunoglobulin heavy chain gene.

The overall response rate with zanubrutinib was 94.6% versus 85.3% with B+R, and estimated 24-month overall survival was 94.3% versus 94.6%.

Rates of adverse events of any grade were similar between the two treatment groups, although B+R was associated with a higher (grade ≥ 3) adverse event rate, at 79.7%, versus 52.5% for zanubrutinib, and a higher rate of treatment discontinuation because of adverse events, at 13.7% versus 8.3%.

Interestingly, any-grade hypertension was more common with zanubrutinib than with B+R, at 14.2% versus 10.6%, but neutropenia was much less common with zanubrutinib, at 15.8% versus 56.8%.

The studies were sponsored by BeiGene. Dr. Hillmen has reported relationships with Janssen, AbbVie, Pharmacyclics, Roche, Gilead, AstraZeneca, SOBI, and BeiGene. Dr. Munir has reported relationships with AbbVie, AstraZeneca, Roche, Alexion, Janssen, MorphoSys, and SOBI. Other authors have also declared numerous relationships.

A version of this article first appeared on Medscape.com.

Publications
Topics
Sections

Publications
Topics
Article Type
Sections
Disallow All Ads
Content Gating
No Gating (article Unlocked/Free)
Alternative CME
Disqus Comments
Default
Use ProPublica
Hide sidebar & use full width
render the right sidebar.
Conference Recap Checkbox
Not Conference Recap
Clinical Edge
Display the Slideshow in this Article
Medscape Article
Display survey writer
Reuters content
Disable Inline Native ads
WebMD Article

Cancer diet studies: Veggies get another rave, while red meat’s busted again

Article Type
Changed
Thu, 12/15/2022 - 14:33

A pair of new studies offers more evidence for the value of vegetables and the risk of red meat on the cancer prevention front. Researchers report that high consumption of vegetables – especially lettuce, legumes, and cruciferous varieties – appears to lower the risk of liver cancer/liver disease. A separate team suggests that high consumption of red meat, organ meats, and processed meats boosts the risk of gastric cancer.

The findings of the latter study “reinforce the idea that avoidance of red meat and processed meat is probably good beyond [the prevention of] colorectal cancer,” said corresponding author and epidemiologist Paolo Boffetta, MD, MPH, of Stony Brook University Cancer Center, New York, in an interview. “The possible carcinogenic effect may extend beyond the colon.”

Both studies were released at the annual meeting of the American Association for Cancer Research.

For the red meat study, researchers examined statistics from the Golestan cohort study, which is prospectively tracking 50,045 people aged 40-75 from northeastern Iran. The study focuses on esophageal cancer due to the region’s high rate of the disease.

Red meat consumption is fairly rare in the region, where residents typically prefer chicken, said study lead author Giulia Collatuzzo, MD, a resident physician in occupational medicine at the University of Bologna, Italy, in an interview. On average, participants reported eating 18.4 grams daily of red meat and 72.1 grams daily of white meat.

The researchers tracked study participants for a median 12-year follow-up, during which 369 developed esophageal cancer and 368 developed gastric cancer. Red meat was linked to more esophageal cancer only in women (hazard ratio, 1.13, 95% confidence interval, 1.00-1.18, for each quintile increase in consumption).

Overall red meat consumption (including red meat, organ meat, and processed meat) was linked to higher rates of gastric cancer (HR, 1.08, 95% CI, 1.00-1.17) for each quartile increase in consumption, as was consumption of the red meat subtype alone (HR, 1.09, 95% CI, 1.00-1.18).

According to Dr. Collatuzzo, the findings suggest that those in the highest quartile of overall red meat consumption may have around a 25% increase in risk, compared with the lowest quartile.

Overall, she said, the study findings aren’t surprising. The lack of a connection between red meat consumption and esophageal cancer may be due to the fact that meat only temporarily transits through the esophagus, she said.

For the liver cancer/liver disease study, researchers examined the medical records of 470,653 subjects in the NIH-AARP Diet and Health Study. They were recruited in 1995-1996 when they were 50-71 years old. Over a median follow-up of 15.5 years, 899 developed liver cancer, and 934 died of chronic liver disease.

The median intakes of vegetables in quintile 5 (highest) and quintile 1 (lowest) were 3.7 cups daily and 1.0 cups daily, respectively, said study lead author Long-Gang Zhao, MS, a graduate student at Harvard University.

After adjusting for possible confounders, those in the highest quintile of vegetable consumption were a third less likely to develop liver cancer, compared with the lowest quintile (HR, 0.66, 95% CI, 0.53-0.82, P < 0.01). Several types of vegetables appeared to be the strongest cancer fighters: cruciferous (broccoli, cauliflower), lettuce, legumes, and carrots. These kinds of vegetables were also linked to lower rates of chronic liver disease mortality (all P < 0.01), as was total vegetable intake for the top quintile versus the lowest quintile (HR, 0.60, 95% CI, 0.49-0.74, P < 0.01).

“A one-cup increase (8 oz or 225 g) in vegetable intake was associated with about 20% decreased risk of liver cancer incidence and chronic liver mortality,” Zhao said.

There was no statistically significant link between fruit consumption and liver cancer or chronic liver disease mortality.

The findings provide more insight into diet and liver disease, Zhao said. “Chronic liver disease, which predisposes to liver cancer, is the tenth cause of death worldwide, causing two million deaths each year. It shares some etiological processes with liver cancer. Therefore, examining both chronic liver disease mortality and liver cancer incidence in our study may provide a more general picture for the prevention of liver diseases.”

As for limitations, both studies are based on self-reports about food consumption, which can be unreliable, and the subjects in the fruit/vegetable analysis were mainly of European origin.

The authors of both studies report no relevant disclosures. No funding is reported for either study.

Meeting/Event
Publications
Topics
Sections

A pair of new studies offers more evidence for the value of vegetables and the risk of red meat on the cancer prevention front. Researchers report that high consumption of vegetables – especially lettuce, legumes, and cruciferous varieties – appears to lower the risk of liver cancer/liver disease. A separate team suggests that high consumption of red meat, organ meats, and processed meats boosts the risk of gastric cancer.

The findings of the latter study “reinforce the idea that avoidance of red meat and processed meat is probably good beyond [the prevention of] colorectal cancer,” said corresponding author and epidemiologist Paolo Boffetta, MD, MPH, of Stony Brook University Cancer Center, New York, in an interview. “The possible carcinogenic effect may extend beyond the colon.”

Both studies were released at the annual meeting of the American Association for Cancer Research.

For the red meat study, researchers examined statistics from the Golestan cohort study, which is prospectively tracking 50,045 people aged 40-75 from northeastern Iran. The study focuses on esophageal cancer due to the region’s high rate of the disease.

Red meat consumption is fairly rare in the region, where residents typically prefer chicken, said study lead author Giulia Collatuzzo, MD, a resident physician in occupational medicine at the University of Bologna, Italy, in an interview. On average, participants reported eating 18.4 grams daily of red meat and 72.1 grams daily of white meat.

The researchers tracked study participants for a median 12-year follow-up, during which 369 developed esophageal cancer and 368 developed gastric cancer. Red meat was only linked to more esophageal cancer in women (hazard ratio, 1.13, 95% confidence interval, 1.00-1.18, for each quintile increase in consumption).

Overall red meat consumption (including red meat, organ meat, and processed meat) was linked to higher rates of gastric cancer (HR, 1.08, 95% CI, 1.00-1.17) for each quartile increase in consumption, as was consumption of the red meat subtype alone (HR, 1.09, 95% CI, 1.00-1.18).

According to Dr. Collatuzzo, the findings suggest that those in the highest quartile of overall red meat consumption may have around a 25% increase in risk, compared with the lowest quartile.

Overall, she said, the study findings aren’t surprising. The lack of a connection between red meat consumption and esophageal cancer may be due to the fact that meat only temporarily transits through the esophagus, she said.

For the liver cancer/liver disease study, researchers examined the medical records of 470,653 subjects in the NIH-AARP Diet and Health Study. They were recruited in 1995-1996 when they were 50-71 years old. Over a median follow-up of 15.5 years, 899 developed liver cancer, and 934 died of chronic liver disease.

The median intakes of vegetables in quintile 5 (highest) and quintile 1 (lowest) were 3.7 cups daily and 1.0 cups daily, respectively, said study lead author Long-Gang Zhao, MS, a graduate student at Harvard University.

After adjusting for possible cofounders, those in the highest quintile of vegetable consumption were a third less likely to develop liver cancer, compared with the lowest quintile (HR, 0.66, 95% CI, 0.53-0.82, P < 0.01). Several types of vegetables appeared to be the strongest cancer fighters: cruciferous (broccoli, cauliflower), lettuce, legumes, and carrots. These kinds of vegetables were also linked to lower rates of chronic liver disease mortality (all P < 0.01), as was total vegetable intake for the top quintile versus the lowest quintile (HR, 0.60, 95% CI, 0.49-0.74, P = < 0.01).

“A one-cup increase (8 oz or 225 g) in vegetable intake was associated with about 20% decreased risk of liver cancer incidence and chronic liver mortality,” Zhao said.

There was no statistically significant link between fruit consumption and liver cancer or chronic liver disease mortality.

The findings provide more insight into diet and liver disease, Zhao said. “Chronic liver disease, which predisposes to liver cancer, is the tenth cause of death worldwide, causing two million deaths each year. It shares some etiological processes with liver cancer. Therefore, examining both chronic liver disease mortality and liver cancer incidence in our study may provide a more general picture for the prevention of liver diseases.”

As for limitations, both studies are based on self-reports about food consumption, which can be unreliable, and the subjects in the fruit/vegetable analysis were mainly of European origin.

The authors of both studies report no relevant disclosures. No funding is reported for either study.

A pair of new studies offers more evidence for the value of vegetables and the risk of red meat on the cancer prevention front. Researchers report that high consumption of vegetables – especially lettuce, legumes, and cruciferous varieties – appears to lower the risk of liver cancer/liver disease. A separate team suggests that high consumption of red meat, organ meats, and processed meats boosts the risk of gastric cancer.

The findings of the latter study “reinforce the idea that avoidance of red meat and processed meat is probably good beyond [the prevention of] colorectal cancer,” said corresponding author and epidemiologist Paolo Boffetta, MD, MPH, of Stony Brook University Cancer Center, New York, in an interview. “The possible carcinogenic effect may extend beyond the colon.”

Both studies were released at the annual meeting of the American Association for Cancer Research.

For the red meat study, researchers examined statistics from the Golestan cohort study, which is prospectively tracking 50,045 people aged 40-75 from northeastern Iran. The study focuses on esophageal cancer due to the region’s high rate of the disease.

Red meat consumption is fairly rare in the region, where residents typically prefer chicken, said study lead author Giulia Collatuzzo, MD, a resident physician in occupational medicine at the University of Bologna, Italy, in an interview. On average, participants reported eating 18.4 grams daily of red meat and 72.1 grams daily of white meat.

The researchers tracked study participants over a median follow-up of 12 years, during which 369 developed esophageal cancer and 368 developed gastric cancer. Red meat was linked to esophageal cancer only in women (hazard ratio, 1.13; 95% confidence interval, 1.00-1.18, for each quintile increase in consumption).

Overall red meat consumption (including red meat, organ meat, and processed meat) was linked to higher rates of gastric cancer (HR, 1.08, 95% CI, 1.00-1.17) for each quartile increase in consumption, as was consumption of the red meat subtype alone (HR, 1.09, 95% CI, 1.00-1.18).

According to Dr. Collatuzzo, the findings suggest that those in the highest quartile of overall red meat consumption may have around a 25% increase in risk, compared with the lowest quartile.
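That figure can be roughly reproduced by compounding the per-quartile hazard ratio across the three steps that separate the lowest and highest quartiles (a back-of-the-envelope sketch for illustration, not part of the study's own analysis):

```python
# Per-quartile hazard ratio reported for overall red meat consumption
# (red meat, organ meat, and processed meat combined).
hr_per_quartile = 1.08

# Moving from the lowest (Q1) to the highest (Q4) quartile spans three
# quartile steps; hazard ratios compound multiplicatively across steps.
hr_q4_vs_q1 = hr_per_quartile ** 3
print(f"Q4 vs. Q1 hazard ratio: {hr_q4_vs_q1:.2f}")  # 1.26, i.e. roughly 25% higher risk
```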

Overall, she said, the study findings aren't surprising. The lack of an overall connection between red meat consumption and esophageal cancer may reflect the fact that meat transits only briefly through the esophagus.

For the liver cancer/liver disease study, researchers examined the medical records of 470,653 subjects in the NIH-AARP Diet and Health Study. They were recruited in 1995-1996 when they were 50-71 years old. Over a median follow-up of 15.5 years, 899 developed liver cancer, and 934 died of chronic liver disease.

The median intakes of vegetables in quintile 5 (highest) and quintile 1 (lowest) were 3.7 cups daily and 1.0 cups daily, respectively, said study lead author Long-Gang Zhao, MS, a graduate student at Harvard University.

After adjusting for possible confounders, those in the highest quintile of vegetable consumption were about a third less likely to develop liver cancer than those in the lowest quintile (HR, 0.66; 95% CI, 0.53-0.82; P < 0.01). Several types of vegetables appeared to be the strongest cancer fighters: cruciferous vegetables (broccoli, cauliflower), lettuce, legumes, and carrots. These vegetables were also linked to lower rates of chronic liver disease mortality (all P < 0.01), as was total vegetable intake for the top quintile versus the lowest quintile (HR, 0.60; 95% CI, 0.49-0.74; P < 0.01).
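The reported hazard ratios translate directly into the "about a third" and related reductions, as this illustrative calculation shows (treating an HR below 1 as a proportional reduction in hazard):

```python
# Reported hazard ratios for the highest vs. lowest quintile of
# vegetable intake (values as quoted above).
hazard_ratios = {
    "liver cancer incidence": 0.66,
    "chronic liver disease mortality": 0.60,
}

# An HR below 1 corresponds to a proportional reduction in hazard.
for outcome, hr in hazard_ratios.items():
    print(f"{outcome}: {100 * (1 - hr):.0f}% lower hazard")
```

The first line works out to a 34% lower hazard, consistent with "about a third less likely."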

“A one-cup increase (8 oz or 225 g) in vegetable intake was associated with about 20% decreased risk of liver cancer incidence and chronic liver mortality,” Zhao said.

There was no statistically significant link between fruit consumption and liver cancer or chronic liver disease mortality.

The findings provide more insight into diet and liver disease, Zhao said. “Chronic liver disease, which predisposes to liver cancer, is the tenth leading cause of death worldwide, causing two million deaths each year. It shares some etiological processes with liver cancer. Therefore, examining both chronic liver disease mortality and liver cancer incidence in our study may provide a more general picture for the prevention of liver diseases.”

As for limitations, both studies are based on self-reports about food consumption, which can be unreliable, and the subjects in the fruit/vegetable analysis were mainly of European origin.

The authors of both studies report no relevant disclosures. No funding is reported for either study.

Article Source

FROM AACR 2022


New York NPs join half of states with full practice authority

Article Type
Changed
Thu, 12/15/2022 - 14:33

With New York nurse practitioners recently gaining full practice authority (FPA), half of the country’s NPs now have the ability to provide patients with easier access to care, according to leading national nurse organizations.

New York joins 24 other states, the District of Columbia, and two U.S. territories that have adopted FPA legislation, as reported by the American Association of Nurse Practitioners (AANP). Like other states, New York has been under an emergency order during the pandemic that allowed NPs to practice to their full authority because of staffing shortages. That order was extended multiple times and was expected to expire this month, AANP reports.

“This has been in the making for nurse practitioners in New York since 2014, trying to get full practice authority,” Michelle Jones, RN, MSN, ANP-C, director at large for the New York State Nurses Association, said in an interview.

NPs who were allowed to practice independently during the pandemic campaigned for that provision to become permanent once the emergency order expired, she said. Ms. Jones explained that the FPA law expands the scope of practice and “removes unnecessary barriers,” namely an agreement with doctors to oversee NPs’ actions.

FPA gives NPs the authority to evaluate patients; diagnose, order, and interpret diagnostic tests; and initiate and manage treatments – including prescribing medications – without oversight by a doctor or state medical board, according to AANP.

Before the pandemic, New York NPs had “reduced” practice authority: those with more than 3,600 hours of experience were required to maintain a collaborative practice agreement with doctors, and those with less experience a written agreement. The new law grants full practice authority to those with more than 3,600 hours of experience, Stephen A. Ferrara, DNP, FNP-BC, AANP regional director, said in an interview.

Dr. Ferrara, who practices in New York, said the state is the largest yet to change to FPA. He said New York and the other states that have moved to FPA determined that there “has been no lapse in quality care” during the emergency order period and that the regulatory barriers had kept NPs from providing access to care.

Ms. Jones said that the law will also allow NPs to open private practices and serve underserved patients in areas that lack access to health care. “This is a step to improve access to health care and health equity of the New York population.”

New York is the first state in some time to pass FPA legislation; the most recent before it were Massachusetts, in January 2021, and Delaware, in August 2021, according to AANP.

Earlier this month, AANP released new data showing a 9% increase in NPs licensed to practice in the United States, rising from 325,000 in May 2021 to 355,000.
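The reported growth checks out as a simple percent change (an arithmetic check only; the counts are those reported by AANP):

```python
# Licensed NP counts as reported by AANP.
nps_may_2021 = 325_000
nps_current = 355_000  # figure from the AANP release cited above

pct_increase = (nps_current - nps_may_2021) / nps_may_2021 * 100
print(f"{pct_increase:.1f}% increase")  # 9.2%, consistent with the reported 9%
```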

The New York legislation “will help New York attract and retain nurse practitioners and provide New Yorkers better access to quality care,” AANP President April Kapu, DNP, APRN, said in a statement.

A version of this article first appeared on Medscape.com.

