NAFLD progresses to cirrhosis in young and old at similar rate

Metabolic and genetic risk factors for nonalcoholic fatty liver disease (NAFLD) vary across the age spectrum, but once steatosis has started, the risk of progression to cirrhosis is similar for both young and old, investigators found.

At a large Midwest medical center, younger adults were more likely than older patients to have a high-risk gene variant predisposing carriers to NAFLD. And they were less likely than their senior counterparts to have metabolic risk factors, reported Matthew J. Miller, MD, a third-year resident in the department of internal medicine at the University of Michigan Hospital in Ann Arbor.

“Progression to cirrhosis was similar in patients younger than 40, compared to older patients, suggesting NAFLD in the young should not be considered more benign than in older patients,” he said in an oral abstract presented at the annual meeting of the American College of Gastroenterology.

The prevalence of NAFLD among younger adults is increasing, but it’s still unknown whether the course of NAFLD is more benign in these patients than in older adults.

In addition, the rate of progression to cirrhosis in patients with NAFLD can vary, making it difficult to predict those patients most at risk for advanced liver disease, Dr. Miller said.

He and his colleagues sought to characterize genetic and metabolic risk factors for NAFLD and their effects on disease progression in patients aged 18-39 years, 40-59 years, and 60 years or older.

The investigators collected data on patients with documented objective evidence of NAFLD seen at the Michigan Medicine health care system from 2010 through 2021.

They identified NAFLD by hepatic steatosis on imaging, biopsy, or transient elastography in the absence of other chronic liver diseases, with the earliest date of a hepatic steatosis diagnosis determined to be the index date.

The investigators determined the presence of cirrhosis using validated International Classification of Diseases (ICD-9 or ICD-10) codes, with incident cirrhosis defined as any new cirrhosis diagnosis at least 1 year after the index date.
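
For readers who want to see that rule concretely, here is a minimal sketch of the incident-cirrhosis definition described above (a new cirrhosis code at least 1 year after the steatosis index date). The function name, the 365-day cutoff implementation, and the example dates are hypothetical illustrations, not the study's code.

from datetime import date, timedelta

def is_incident_cirrhosis(index_date: date, cirrhosis_dx_date: date) -> bool:
    """New cirrhosis diagnosis at least 1 year after the steatosis index date."""
    return cirrhosis_dx_date >= index_date + timedelta(days=365)

print(is_incident_cirrhosis(date(2015, 3, 1), date(2017, 6, 15)))  # True: counted as incident cirrhosis
print(is_incident_cirrhosis(date(2015, 3, 1), date(2015, 9, 1)))   # False: within the first year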

They also looked at the frequency of known NAFLD risk alleles in a subset of patients with available genetic data.

They divided 31,505 patients into three age groups for comparison: 8,252 patients age 18-39 at the time of steatosis identification, 15,035 age 40-59, and 8,218 age 60 or older.

Of the full cohort, 804 had cirrhosis at the index date, and 388 others developed incident cirrhosis during 128,090 person-years of follow-up.
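
As a rough, back-of-the-envelope illustration (not a rate reported by the investigators), those counts imply a crude incidence of about 3 cases of cirrhosis per 1,000 person-years:

incident_cases = 388
person_years = 128_090

rate_per_1000_py = incident_cases / person_years * 1_000
print(f"{rate_per_1000_py:.1f} incident cirrhosis cases per 1,000 person-years")  # ~3.0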

The prevalence of hypertension, hyperlipidemia, and diabetes was significantly lower in the youngest group, compared with the two older groups, but the youngest patients had a higher prevalence of obesity than the other two groups, with a significantly higher prevalence of class 3 (morbid) obesity.

Of the 4,359 patients with genetic data available, the NAFLD-promoting PNPLA3-rs738409-G allele was more common in the young, compared with the other two age groups (P = .016).

When the investigators looked at the ability of three laboratory tests – the AST to Platelet Ratio Index (APRI), the Fibrosis-4 (FIB-4) index, and the NAFLD fibrosis score (NFS) – to identify prevalent cirrhosis, they found that the scores performed similarly for patients in the 40-59 group, but the NFS did less well among patients in the 18-39 group. There were no significant differences among the three age groups in the risk for incident cirrhosis over 10 years.
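
For reference, the first two of those scores are simple ratios of routine laboratory values. The sketch below uses their standard published formulas with hypothetical inputs; it is not the study's implementation, and the NFS (which also requires BMI, diabetes status, the AST/ALT ratio, and albumin) is omitted.

import math

def apri(ast_u_l: float, ast_uln_u_l: float, platelets_10e9_l: float) -> float:
    """AST to Platelet Ratio Index: (AST / upper limit of normal AST) x 100 / platelets."""
    return (ast_u_l / ast_uln_u_l) * 100.0 / platelets_10e9_l

def fib4(age_years: float, ast_u_l: float, alt_u_l: float, platelets_10e9_l: float) -> float:
    """Fibrosis-4 index: (age x AST) / (platelets x sqrt(ALT))."""
    return (age_years * ast_u_l) / (platelets_10e9_l * math.sqrt(alt_u_l))

# Hypothetical 35-year-old: AST 48 U/L, ALT 60 U/L, platelets 210 x 10^9/L,
# and an assumed AST upper limit of normal of 40 U/L.
print(round(apri(48, 40, 210), 2))      # ~0.57
print(round(fib4(35, 48, 60, 210), 2))  # ~1.03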

The study helps to answer some of the questions surrounding differences in risk factors across the age spectrum, commented Patricia Jones, MD, MSCR, from the University of Miami.

“We wonder how these people with fatty liver are different. Do younger people have a more malignant course? Are they going to progress more rapidly than others, or not? Because if you think of a disease like fatty liver or for that matter any metabolic syndrome–based disease, it’s a spectrum and a continuum, and by the time you’re diagnosed you’ve already had that condition, so it’s really more interesting to me when people are diagnosed, because diagnosing at a younger age allows you to intervene earlier,” she said in an interview.

Dr. Jones said that she was also interested in exploring how the genetic data might be used to improve care for patients, perhaps by testing for the high-risk allele in routine clinical practice.

“It will be interesting to see how people with this allele progress, independently of whether they’re diagnosed at 40, 50, or 60,” she said.

Dr. Jones was a moderator of the session where Dr. Miller presented his data.

Comoderator Mitchell A. Mah’moud, MD, FACG, from Duke University in Durham, N.C., commented in an interview that, “with the medications we have available, maybe we can target these patients and prevent progression to cirrhosis and some of the decompensation that we see.”

The study authors did not disclose a funding source. Dr. Miller, Dr. Jones, and Dr. Mah’moud all reported having no relevant financial disclosures.

Research ties gout in women to comorbidities more than genetics

Comorbidities may play a greater role than genetics in women with gout, although this appears not to be true for men, Nicholas Sumpter, MSc, of the University of Alabama at Birmingham, said at the annual research symposium of the Gout, Hyperuricemia, and Crystal Associated Disease Network (G-CAN).

Mr. Sumpter was among the authors of a recent paper in Arthritis & Rheumatology that suggested that earlier gout onset involves the accumulation of certain allelic variants in men. This genetic risk was shared across multiple ancestral groups in the study, conducted with men of European and Polynesian ancestry, Mr. Sumpter and colleagues reported.

“There might be more than one factor in gout in men, but in women we’ve been getting at this idea that comorbidities are the big thing,” he said.

During his presentation, Mr. Sumpter offered a hypothesis that in men there might be a kind of “two-pronged attack,” with increases in serum urate linked to genetic risk, but comorbidities also playing a role. “But that may not be the case for women.”

In his presentation, Mr. Sumpter noted a paper published in March 2022 from his University of Alabama at Birmingham colleagues, Aakash V. Patel, MD, and Angelo L. Gaffo, MD. In the article, Dr. Patel and Dr. Gaffo delved into the challenges of treating women with gout given “the paucity of appropriately well-powered, randomized-controlled trials investigating the efficacy” of commonly used treatments.



“This poses major challenges for the management of female gout patients since they carry a greater burden of cardiovascular and renal morbidity, which is known to modulate the pathophysiology of gout; as such, conclusions regarding the efficacy of treatments for females cannot be extrapolated from investigative studies that are predominantly male,” they wrote, calling for increased efforts to enroll women in studies of treatments for this condition.

There’s increased interest in how gout affects women. For example, a paper published in September in Arthritis & Rheumatology found that people with gout, especially women, appear to be at higher risk for poor COVID-19 outcomes, including hospitalization and death, regardless of COVID-19 vaccination status.

Gout has become more common in women, although this remains a condition that is far more likely to strike men.

The age-standardized prevalence of gout among women rose from 233.52 per 100,000 in 1990 to 253.49 in 2017, a gain of about 9%, according to a systematic analysis of the Global Burden of Disease Study.

That topped the nearly 6% gain seen for men in the same time frame, with the rate going from 747.48 per 100,000 to 790.90. With the aging of the global population, gout’s burden in terms of prevalence and disability is expected to increase.
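
Those percentages follow directly from the reported age-standardized rates; the quick arithmetic below is illustrative only.

def pct_change(old: float, new: float) -> float:
    return (new - old) / old * 100.0

# Age-standardized prevalence per 100,000, 1990 vs. 2017, from the figures above.
women = pct_change(233.52, 253.49)  # ~8.6%
men = pct_change(747.48, 790.90)    # ~5.8%
print(f"women: {women:.1f}%  men: {men:.1f}%")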

Impact of obesity and healthy eating patterns

Obesity, or excess adiposity, appears to be of particular concern for women in terms of gout risk.

While obesity and genetic predisposition both are strongly associated with a higher risk of gout, the excess risk of both combined was higher than the sum of each, particularly among women, Natalie McCormick, PhD, of Massachusetts General Hospital, Boston, and coauthors reported in Annals of the Rheumatic Diseases.
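
One standard way to express that kind of supra-additive joint effect is the relative excess risk due to interaction (RERI). The sketch below is purely illustrative, with hypothetical relative risks rather than values from the analysis by Dr. McCormick and colleagues.

def reri(rr_both: float, rr_obesity_only: float, rr_genetic_only: float) -> float:
    """Relative excess risk due to interaction: RR11 - RR10 - RR01 + 1.
    A value above 0 means the joint excess risk exceeds the sum of the
    separate excess risks (supra-additive interaction)."""
    return rr_both - rr_obesity_only - rr_genetic_only + 1.0

# Hypothetical relative risks: obesity alone 2.5, genetic predisposition alone 2.0,
# both together 5.5 -> RERI of 2.0, i.e., more than additive.
print(reri(rr_both=5.5, rr_obesity_only=2.5, rr_genetic_only=2.0))  # 2.0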

These findings suggested that “addressing excess adiposity could prevent a large proportion of female gout cases in particular, as well as its cardiometabolic comorbidities, and the benefit could be greater in genetically predisposed women,” they wrote.

In general, there’s a need to re-examine the advice given by many clinicians in the past that people with gout, or those at risk for it, should follow a low-protein diet to avoid purines, Dr. McCormick said in an interview.



“Now we’re finding that a healthier diet that balances protein as well as fat intake can actually be better both for cardiovascular health and for gout prevention,” she said.

Dr. McCormick’s research on this topic includes a 2022 JAMA Internal Medicine article, and a 2021 article in Current Rheumatology Reports. In the latter article, Dr. McCormick and colleagues examined the benefits of changing habits for patients, such as following one of several well-established healthy eating patterns, including the Mediterranean and DASH diets.

With excess weight and associated cardiovascular and endocrine risks already elevated among people with gout, especially women, the “conventional low-purine (i.e., low-protein) approach to gout dietary guidance is neither helpful nor sustainable and may lead to detrimental effects related to worsening insulin resistance as a result of substitution of healthy proteins with unhealthy carbohydrates or fats,” they wrote. “Rather, by focusing our dietary recommendations on healthy eating patterns which have been proven to reduce cardiometabolic risk factors, as opposed to singular ‘good’ or ‘bad’ food items or groups, the beneficial effects of such diets on relevant gout endpoints should naturally follow for the majority of typical gout cases, mediated through changes in insulin resistance.”

Mr. Sumpter and Dr. McCormick had no competing interests to declare.

Syphilis screening: Who and when

The US Preventive Services Task Force (USPSTF) published updated recommendations on screening for syphilis on September 27.1 The Task Force continues to recommend screening for all adolescents and adults who are at increased risk for infection. (As part of previous recommendations, the USPSTF also advocates screening all pregnant women for syphilis early in their pregnancy to prevent congenital syphilis.2)

Who is at increased risk? Men who have sex with men (MSM), those with HIV or other sexually transmitted infections (STIs), those who use illicit drugs, and those with a history of incarceration, sex work, or military service are considered to be at increased risk for syphilis. Additionally, since state and local health departments collect and publish STI incidence data, it’s important to stay up to date on how common syphilis is in one’s community and tailor screening practices accordingly.

Men account for more than 80% of all primary and secondary syphilis infections, and MSM account for 53% of cases in men.3 The highest rates of syphilis are in men ages 25-29 years and 30-34 years (58.1 and 55.7 cases per 100,000, respectively).3

Why screening is important. Primary and secondary syphilis rates have increased steadily from an all-time low of 2.1 per 100,000 in 2000 to 12.7 per 100,000 in 2020.4 There were 171,074 cases reported in 2021.5

If not detected and treated, syphilis will progress from the primary and secondary stages to a latent form. About one-third of those with latent syphilis will develop tertiary syphilis, which can affect every organ system and cause multiple neurologic disorders.

How to screen. Syphilis screening typically involves a 2-step process. The first test that should be performed is a Venereal Disease Research Laboratory (VDRL) or rapid plasma reagin (RPR) test. This is followed by a treponemal antibody test if the initial test is positive. While the VDRL and RPR tests have high sensitivity, many other conditions can cause a false-positive result, necessitating confirmation with the more specific antibody test.
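
A minimal sketch of that 2-step logic (nontreponemal screen first, treponemal confirmation second) is shown below; the function name, parameters, and messages are illustrative, not part of any clinical decision-support system.

from typing import Optional

def screen_syphilis(rpr_or_vdrl_reactive: bool,
                    treponemal_reactive: Optional[bool] = None) -> str:
    """Traditional 2-step algorithm: nontreponemal test, then treponemal confirmation."""
    if not rpr_or_vdrl_reactive:
        return "Nonreactive screen; rescreen on a risk-based schedule."
    if treponemal_reactive is None:
        return "Reactive RPR/VDRL; order a treponemal antibody test to confirm."
    if treponemal_reactive:
        return "Both tests reactive; consistent with syphilis. Stage, treat, and report."
    return "Reactive RPR/VDRL but nonreactive treponemal test; likely false-positive screen."

print(screen_syphilis(True))                            # prompts the confirmatory test
print(screen_syphilis(True, treponemal_reactive=True))  # confirmed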

As far as frequency, the Task Force suggests screening annually for those at continued risk and more frequently (every 3 or 6 months) for those at highest risk.

Treatment for primary, secondary, and early latent syphilis (< 1 year’s duration) is a single intramuscular (IM) injection of benzathine penicillin, 2.4 million units. For late latent syphilis or syphilis of unknown duration, treatment is benzathine penicillin, 2.4 million units, administered in 3 weekly IM doses.

Treatment for those with penicillin allergies depends on the stage of syphilis and whether or not the patient is pregnant. Refer to the STD treatment guidelines for guidance.6

The CDC recommends presumptive treatment for anyone who has had sexual contact in the past 90 days with a person who’s been given a diagnosis of primary, secondary, or early latent syphilis.6

And finally, remember that all STIs are reportable to your local health department, which can assist with contact tracing and treatment follow-up.

References

1. USPSTF. Syphilis infection in nonpregnant adolescents and adults: Screening. Final recommendation statement. September 27, 2022. Accessed October 25, 2022. https://uspreventiveservicestaskforce.org/uspstf/recommendation/syphilis-infection-nonpregnant-adults-adolescents-screening

2. USPSTF. Syphilis infection in pregnant women: screening. Final recommendation statement. September 4, 2018. Accessed October 25, 2022. https://www.uspreventiveservicestaskforce.org/uspstf/recommendation/syphilis-infection-in-pregnancy-screening

3. CDC. Sexually transmitted disease surveillance 2020: syphilis. Updated August 22, 2022. Accessed October 25, 2022. www.cdc.gov/std/statistics/2020/figures/2020-STD-Surveillance-Syphilis.pptx

4. CDC. Sexually transmitted disease surveillance 2020. Table 1: Sexually transmitted diseases—reported cases and rates of reported cases, United States, 1941-2020. Updated April 12, 2022. Accessed October 25, 2022. www.cdc.gov/std/statistics/2020/tables/1.htm

5. CDC. Preliminary 2021 STD surveillance data. Updated September 1, 2022. Accessed October 25, 2022. www.cdc.gov/std/statistics/2021/default.htm

6. Workowski KA, Bachmann LH, Chan PA, et al. Sexually transmitted infections treatment guidelines, 2021. MMWR Recomm Rep. 2021;70:1-187.

Author and Disclosure Information

Doug Campos-Outcalt, MD, MPA, is a clinical professor at the University of Arizona College of Medicine and a senior lecturer with the University of Arizona College of Public Health. He’s also an assistant editor at The Journal of Family Practice.

The author is a paid consultant to the CDC’s Advisory Committee on Immunization Practices.

Is MRI a viable alternative to lumbar puncture for MS diagnosis?

To diagnose multiple sclerosis (MS), the central vein sign on brain MRI appears to work as well as oligoclonal bands in cerebrospinal fluid, and combining the two biomarkers yields the highest predictive value for MS, a new study indicates.

The presence of oligoclonal bands is “very specific for MS and is obtained by lumbar puncture, which is invasive and can be unpleasant, so it is not an ideal test,” said study investigator Daniel Ontaneda, MD, PhD, with the Mellen Center for MS Treatment and Research at the Cleveland Clinic.

In a pilot study, the central vein sign was “highly correlated with the presence of oligoclonal bands and in many cases could serve to prove that a person has MS without the need for a spinal tap,” Dr. Ontaneda said.

The study was presented at the annual meeting of the European Committee for Treatment and Research in Multiple Sclerosis (ECTRIMS).

Reducing the need for lumbar puncture

Oligoclonal bands in cerebrospinal fluid are commonly used as a diagnostic biomarker for MS and can serve to meet the requirement for dissemination in time in the 2017 McDonald criteria. Central vein sign is an emerging neuroimaging biomarker for MS that may improve diagnostic accuracy and reduce the need for lumbar puncture.

For the study, the investigators compared the sensitivity, specificity, and positive predictive value of central vein sign on MRI with that of oligoclonal bands in cerebrospinal fluid for MS diagnosis.

Among the 53 participants, 24 (45%) met 2017 McDonald criteria for dissemination in space and time at baseline, and 27 (51%) met the criteria at 12-month follow-up.

At initial presentation, sensitivity for MS diagnosis was 75% for oligoclonal bands, 83% for central vein sign “Select-3” (3 central vein sign–positive lesions per scan), and 71% for central vein sign “Select-6” (6 central vein sign–positive lesions per scan).

The point estimate of sensitivity for the central vein sign was higher than that for oligoclonal bands, but there was no significant difference in sensitivities across methods.

Specificity at initial presentation was 76% for oligoclonal bands, 48% for Select-3, and 86% for Select-6.

Both the presence of oligoclonal bands (P = .03) and Select-6 (P = .001) were more specific than Select-3 for the diagnosis of MS at initial presentation; there was no significant difference in specificity between cerebrospinal fluid oligoclonal bands and central vein sign Select-6.

At 12-month follow-up, the positive predictive value was 84% for oligoclonal bands and 95% for Select-6; combining oligoclonal bands and Select-6 gave a positive predictive value of 100%.
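
For readers less familiar with these metrics, the sketch below shows how sensitivity, specificity, and positive predictive value are calculated from a 2x2 table; the counts are hypothetical examples, not the study's data.

def diagnostic_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    return {
        "sensitivity": tp / (tp + fn),  # of true MS cases, the fraction that test positive
        "specificity": tn / (tn + fp),  # of non-MS cases, the fraction that test negative
        "ppv": tp / (tp + fp),          # of positive tests, the fraction who truly have MS
    }

# Hypothetical counts only:
print(diagnostic_metrics(tp=20, fp=2, fn=4, tn=22))
# -> sensitivity ~0.83, specificity ~0.92, ppv ~0.91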

Dr. Ontaneda said that on the basis of these promising pilot data, the researchers have secured funding from the National Institutes of Health for a prospective study to further investigate the central vein sign as a potential biomarker for MS.

He also said there is “active discussion as to whether central vein sign should be added to the diagnostic criteria for MS.

“We think that it’s probably about time that we have diagnostic biomarkers that are sensitive and specific and can help us do away with complicated criteria to make the diagnosis, in favor of an imaging biomarker,” Dr. Ontaneda said.

A green light for further research

Commenting on the study, Shaheen Lakhan, MD, a neurologist and researcher from Boston, said that, “if an imaging finding on an otherwise routinely done MRI for patients with MS is just as good as analyses from the fluid from a spinal tap, of course, neurologists, and for sure patients, would go for the former.

“However, this study doesn’t fully support that argument just yet. It is retrospective with a tiny sample size, and the full way they standardized assessments and reporting hasn’t been fully reported,” said Dr. Lakhan, who was not involved in the study.

The study does, however, offer a “solid signal to green-light further exploration of a noninvasive assessment that may replace the dreaded spinal tap.

“In general, these principles need to be applied to all our invasive diagnostic criteria from biopsies to risky procedures, and also the incorporation of artificial intelligence/machine learning to aid in standardizing and scaling these assessments – and, frankly, reduce human error in readings,” said Dr. Lakhan.

Funding for the study was provided by the Race to Erase MS Foundation and the NIH. Dr. Ontaneda has received research support from the NIH, the National MS Society, the Patient Centered Outcomes Research Institute, the Race to Erase MS Foundation, Genentech, Sanofi, and Novartis and has consulted for Biogen, Genentech, Sanofi, Janssen, Novartis, and Merck. Dr. Lakhan has disclosed no relevant financial relationships.

A version of this article first appeared on Medscape.com.

Meeting/Event
Publications
Topics
Sections
Meeting/Event
Meeting/Event

To diagnose multiple sclerosis (MS), the central vein sign on brain MRI appears to work as well as oligoclonal bands in cerebrospinal fluid, and combining the two biomarkers yields the highest predictive value for MS, a new study indicates.

The presence of oligoclonal bands is “very specific for MS and is obtained by lumbar puncture, which is invasive and can be unpleasant, so it is not an ideal test,” said study investigator Daniel Ontaneda, MD, PhD, with the Mellen Center for MS Treatment and Research at the Cleveland Clinic.

In a pilot study, the central vein sign was “highly correlated with the presence of oligoclonal bands and in many cases could serve to prove that a person has MS without the need for a spinal tap,” Dr. Ontaneda said.

The study was presented at the annual meeting of the European Committee for Treatment and Research in Multiple Sclerosis (ECTRIMS).
 

Reducing the need for lumbar puncture

Oligoclonal bands in cerebrospinal fluid are commonly used as a diagnostic biomarker for MS and can serve to meet the requirement for dissemination in time in the 2017 McDonald criteria. Central vein sign is an emerging neuroimaging biomarker for MS that may improve diagnostic accuracy and reduce the need for lumbar puncture.

For the study, the investigators compared the sensitivity, specificity, and positive predictive value of central vein sign on MRI with that of oligoclonal bands in cerebrospinal fluid for MS diagnosis.

Among the 53 participants, 24 (45%) met 2017 McDonald criteria for dissemination in space and time at baseline, and 27 (51%) met the criteria at 12-month follow-up.

At initial presentation, sensitivity for MS diagnosis was 75% for oligoclonal bands, 83% for central vein sign “Select-3” (3 central vein sign–positive lesions per scan), and 71% for central vein sign “Select-6” (6 central vein sign–positive lesions per scan).

The point estimate of sensitivity of central vein sign was higher than of oligoclonal bands, but there was no significant difference in sensitivities across methods.

Specificity at initial presentation was 76% for oligoclonal bands, 48% for Select-3, and 86% for Select-6.

The presence of oligoclonal bands was more specific than Select-3 for diagnosis of MS at initial presentation (P = .03), as was Select-6 (P = .001). There was no significant difference when comparing cerebrospinal fluid oligoclonal bands with central vein sign Select-6.

At 12-month follow-up, the positive predictive value was 84% for oligoclonal bands and 95% for Select-6; combining oligoclonal bands and Select-6 gave a positive predictive value of 100%.

Dr. Ontaneda said that on the basis of these promising pilot data, the researchers have secured funding from the National Institutes of Health for a prospective study to further investigate the central vein sign as a potential biomarker for MS.

He also said there is “active discussion as to whether central vein sign should be added to the diagnostic criteria for MS.

“We think that it’s probably about time that we have diagnostic biomarkers that are sensitive and specific and can help us do away with complicated criteria to make the diagnosis, in favor of an imaging biomarker,” Dr. Ontaneda said.
 

 

 

A green light for further research

Commenting on the study, Shaheen Lakhan, MD, a neurologist and researcher from Boston, said that, “if an imaging finding on an otherwise routinely done MRI for patients with MS is just as good as analyses from the fluid from a spinal tap, of course, neurologists, and for sure patients, would go for the former.

“However, this study doesn’t fully support that argument just yet. It is retrospective with a tiny sample size, and the full way they standardized assessments and reporting hasn’t been fully reported,” said Dr. Lakhan, who was not involved in the study.

The study does, however, offer a “solid signal to green-light further exploration of a noninvasive assessment that may replace the dreaded spinal tap.

“In general, these principles need to be applied to all our invasive diagnostic criteria from biopsies to risky procedures, and also the incorporation of artificial intelligence/machine learning to aid in standardizing and scaling these assessments – and, frankly, reduce human error in readings,” said Dr. Lakhan.

Funding for the study was provided by the Race to Erase MS Foundation and the NIH. Dr. Ontaneda has received research support from the NIH, the National MS Society, the Patient Centered Outcomes Research Institute, the Race to Erase MS Foundation, Genentech, Sanofi, and Novartis and has consulted for Biogen, Genentech, Sanofi, Janssen, Novartis, and Merck. Dr. Lakhan has disclosed no relevant financial relationships.

A version of this article first appeared on Medscape.com.


Patients with schizophrenia may be twice as likely to develop dementia

Article Type
Changed
Mon, 10/31/2022 - 13:09

Patients with psychotic disorders such as schizophrenia are more than twice as likely as those without a psychotic disorder to eventually develop dementia, new research suggests.

Results from a review and meta-analysis of almost 13 million total participants from nine countries showed that, across multiple different psychotic disorders, there was a 2.5-fold higher risk of developing dementia later in life compared with individuals who did not have a psychotic disorder. This was regardless of the age at which the patients first developed the mental illness.

Moreover, participants with a psychotic disorder tended to be younger than average when diagnosed with dementia. Two studies showed that those with psychotic disorders were more likely to be diagnosed with dementia as early as in their 60s.

“The findings add to a growing body of evidence linking psychiatric disorders with later cognitive decline and dementia,” senior investigator Jean Stafford, PhD, a research fellow at the MRC Unit for Lifelong Health and Ageing, University College London, told this news organization.

Dr. Stafford noted that the results highlight the importance of being aware of and watchful for symptoms of cognitive decline in patients with psychotic disorders in mid- and late life.

“In addition, given that people with psychotic disorders are at higher risk of experiencing multiple health conditions, including dementia, managing overall physical and mental health in this group is crucial,” she said.

The findings were published online in Psychological Medicine.
 

Bringing the evidence together

There is increasing evidence that multiple psychiatric symptoms and diagnoses are associated with cognitive decline and dementia, with particularly strong evidence for late-life depression, Dr. Stafford said.

“However, the relationship between psychotic disorders and dementia is less well-established,” she added.

Last year, her team published a study showing a strong association between very late onset psychotic disorders, defined as first diagnosed after age 60 years, and increased risk for dementia in Swedish population register data.

“We also became aware of several other large studies on the topic published in the last few years and realized that an up-to-date systematic review and meta-analysis was needed to bring together the evidence, specifically focusing on longitudinal studies,” Dr. Stafford said.

The researchers searched four databases of prospective and retrospective longitudinal studies published through March 2022. Studies were required to focus on adults aged 18 years or older with a clinical diagnosis of a nonaffective psychotic disorder and a comparison group consisting of adults without a nonaffective psychotic disorder.

Of 9,496 papers, the investigators selected 11 published from 2003 to 2022 that met criteria for inclusion in their meta-analysis (12,997,101 participants), with follow-up periods ranging from 1.57 to 33 years.

The studies hailed from Denmark, Finland, Sweden, the United Kingdom, the United States, Australia, Taiwan, New Zealand, and Israel.

Random-effects meta-analyses were used to pool estimates across studies. The researchers assessed the risk of bias for each study. They also included two additional studies in the review, but not the meta-analysis, that focused specifically on late-onset acute and transient psychosis and late-onset delusional disorder.
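
For readers unfamiliar with the method, the sketch below shows in broad strokes what random-effects pooling of risk ratios involves, using the DerSimonian-Laird estimator; the log risk ratios and standard errors are hypothetical and are not the values from the included studies.

```python
# Minimal DerSimonian-Laird random-effects pooling sketch (hypothetical inputs).
import math

def pool_random_effects(log_rr, se):
    """Pool log risk ratios; return pooled RR, 95% CI, and I-squared."""
    w = [1 / s**2 for s in se]                           # inverse-variance weights
    fixed = sum(wi * yi for wi, yi in zip(w, log_rr)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, log_rr))
    df = len(log_rr) - 1
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                        # between-study variance
    w_star = [1 / (s**2 + tau2) for s in se]
    pooled = sum(wi * yi for wi, yi in zip(w_star, log_rr)) / sum(w_star)
    se_pooled = math.sqrt(1 / sum(w_star))
    ci = (math.exp(pooled - 1.96 * se_pooled), math.exp(pooled + 1.96 * se_pooled))
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0  # heterogeneity, percent
    return math.exp(pooled), ci, i2

# Four hypothetical studies (log risk ratios and their standard errors).
print(pool_random_effects(log_rr=[0.9, 1.1, 0.7, 1.3], se=[0.15, 0.20, 0.10, 0.25]))
```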

The other studies focused on late-onset schizophrenia and/or very late onset schizophrenia-like psychoses, schizophrenia, psychotic disorders, and schizophrenia in older people.

Most studies investigated the incidence of all-cause dementia, although one study focused on the incidence of Alzheimer’s disease.

Potential mechanisms

The narrative review showed that most studies (n = 10) were of high methodological quality, although two were rated as fair and one as poor.

Almost all studies accounted for basic sociodemographic confounders. Several also adjusted for comorbidities, alcohol/substance use disorders, medications, smoking status, and income/education level.

Of the meta-analyzed studies, only one showed no significant association between psychotic disorders and dementia, whereas 10 reported an increased risk (pooled risk ratio, 2.52; 95% confidence interval, 1.67-3.80; I², 99.7%).

Subgroup analyses showed an increased risk both in participants with typical and late-onset psychotic disorders (pooled RR, 2.10; 95% CI, 2.33-4.14; I², 77.5%; P = .004) and in those with very late onset schizophrenia-like psychoses (pooled RR, 2.77; 95% CI, 1.74-4.40; I², 98.9%; P < .001).

The effect was larger in studies with a follow-up of less than 10 years vs. those with a follow-up of 10 years or more, and it was also greater in studies conducted in non-European vs. European countries (all P < .001).

Studies with more female participants (≥ 60%) showed higher risk compared with those that had a lower percentage of female participants. Studies published during or after 2020 showed a stronger association than those published before 2020 (all P < .001).

There was also a higher risk for dementia in studies investigating broader nonaffective psychotic disorders compared with studies investigating only schizophrenia, in prospective vs. retrospective studies, and in studies with a minimum age of less than 60 years at baseline vs. a minimum age of 60 or older (all P < .001).

“Several possible mechanisms could underlie these findings, although we were not able to directly test these in our review,” Dr. Stafford said. She noted that psychotic disorders and other psychiatric diagnoses may cause dementia.

“People with psychotic disorders such as schizophrenia are also at higher risk of health conditions including cardiovascular disease and diabetes, which are known risk factors for dementia and could underpin these associations,” said Dr. Stafford.

It is also possible “that psychotic symptoms could be early markers of dementia for some people, rather than causes,” she added.
 

Neuroimaging evidence lacking

Commenting on the study, Dilip V. Jeste, MD, former senior associate dean for healthy aging and senior care and distinguished professor of psychiatry and neurosciences at the University of California, San Diego, complimented the investigators for “an excellent article on an important but difficult topic.”

Limitations “pertain not to the meta-analysis but to the original studies,” said Dr. Jeste, who was not involved with the review. Diagnosing dementia in individuals with psychotic disorders is “challenging because cognitive deficits and behavioral symptoms in psychotic disorders may be misdiagnosed as dementia in some individuals – and vice versa,” he added.

Moreover, the studies did not specify the type of dementia, such as Alzheimer’s disease, vascular, Lewy body, frontotemporal, or mixed. Together, “they account for 90% of the dementias, and most patients with these dementias have brain abnormalities that can clearly be seen on MRI,” Dr. Jeste said.

However, patients with schizophrenia who are diagnosed with dementia “rarely show severe brain atrophy, even in specific regions commonly observed in nonpsychotic people with these dementias,” Dr. Jeste noted.

Thus, objective neuroimaging-based evidence for dementia and its subtype “is lacking in most of the published studies of persons with psychotic disorders diagnosed as having dementia,” he said.

There is a “clear need for comprehensive studies of dementia in people with psychotic disorders to understand the significance of the results,” Dr. Jeste concluded.

The review did not receive any funding. Dr. Stafford was supported by an NIHR-UCLH BRC Postdoctoral Bridging Fellowship and the National Institute for Health Research Biomedical Research Centre at University College London Hospitals NHS Foundation Trust. Dr. Stafford was also the principal investigator in one of the studies meeting the inclusion criteria of the review. The other investigators and Dr. Jeste reported no relevant financial relationships.

A version of this article first appeared on Medscape.com.


An integrative approach to atopic dermatitis features a long list of options

Article Type
Changed
Mon, 10/31/2022 - 13:09

Peter A. Lio, MD, is a big user of the “big guns” for his patients with atopic dermatitis – biologics, conventional immunosuppressants, and JAK inhibitors. But he also has a big menu of treatments – from oral hempseed oil and black tea compresses to probiotics and acupressure – that he encourages patients to try as they use the big guns, or as they attempt to wean off of them or avoid their use altogether.

During a presentation at the annual Integrative Dermatology Symposium, Dr. Lio said that he uses “5 pillars” to guide his integrative treatment plans: The skin barrier, the psyche, the microbiome, inflammation, and itch. “I try to flag approaches that predominantly address the categories that I think need the most help,” he said. “And I tell patients [which pillar or pillars] each treatment is addressing.”

Most commonly, the greatest challenge with AD – and the “single biggest weakness of conventional Western medicine” – lies not with getting patients clear in the first place, but in keeping them clear safely, he said. “I don’t think that using immunosuppressive [medications] is okay for the long-term unless there is no other choice,” said Dr. Lio, who cofounded the Chicago Integrative Eczema Center about 6 years ago and is clinical assistant professor of dermatology and pediatrics at Northwestern University, Chicago. Oftentimes, he said, complementary approaches, including dietary changes, can also serve as supportive adjunctive therapy to biologics and JAK inhibitors.

He has three main criteria, or “filters,” for evaluating these treatments before recommending them to patients: At least some clinical evidence for efficacy (preferably randomized trials but not necessarily), safety, and practicality. The “only way we’re going to move things forward [for AD and other conditions] is to try out less tested treatments ... to open up to them,” Dr. Lio said in an interview after the meeting. And in doing so, he said, dermatologists “can connect with a lot of patients whom naysayers can’t connect with.”
 

An integrative menu

Dr. Lio individualizes plans, suggesting treatments after “listening to patients’ stories” and considering their age, history, symptoms and skin presentation, and other factors. He said he “goes little by little,” telling a patient, for instance, “I’d love for us to try adding a little hemp oil to your diet.”

If patients aren’t pleased with or are tired of treatments, he said in the interview, “we move on and try something else.”

At the meeting, he described some of the treatments on his menu and the supporting evidence for those treatments:

Oral hempseed oil. A randomized crossover study of 20 adult patients with AD found that daily consumption of 2 tablespoons of hempseed oil decreased skin dryness, itchiness, and use of topical medications compared with consumption of olive oil. “It was statistically significant and seemed clinically meaningful,” likely resulting from the high concentration of polyunsaturated fatty acids in the oil, Dr. Lio said.

Topical vitamin B12. In a phase 3 randomized controlled trial of topical B12 applied twice a day for 8 weeks, patients experienced significant improvements in the extent and severity of AD compared with placebo. Another study in children with AD aged 6 months to 18 years found significant improvement in as early as 2 weeks of use. “It really does help, and is very gentle in babies,” Dr. Lio said.

Black tea compresses. “It’s absolutely my favorite kind of compress,” he said. “It was studied on the face and eyelids but I use it all over the body for adults and kids.” A German study of 22 patients with AD or contact facial dermatitis showed significant improvements in facial dermatitis within the first 3 days of treatment with application of black tea dressings plus an emollient cream, with significant reductions in four disease activity scores (the Facial Eczema Area and Severity Index, visual analog scale for pruritus, Investigator’s Global Assessment score, and Patient’s Self-Assessment Score) that continued through day 6.

Oolong tea. In a 2001 study, after 1 month of drinking oolong tea after each meal, 64% of patients with recalcitrant AD who continued with their regular treatment showed marked to moderate improvements in AD, with a beneficial effect first noticed after 1-2 weeks. At 6 months, 54% still had a good response to treatment. “It’s super cheap and accessible,” Dr. Lio said.

Coconut oil. One of the greatest benefits of coconut oil is on the microbiome and the dysbiosis that can result from a disrupted, or “leaky,” skin barrier – especially overgrowth of Staphylococcus aureus, which “drives AD,” Dr. Lio said. In a study of adults with AD from the Philippines, topically applied coconut oil decreased S. aureus colonization by 95% when applied twice daily for 4 weeks, compared with a 50% decrease in an olive oil control group. Other research has shown coconut oil to be superior to mineral oil as a moisturizer, he said at the meeting.

Acupressure. After a pilot study conducted by Dr. Lio and colleagues showed greater decreases in itch (per the visual analogue scale) in adults with AD who applied an acupressure bead at the LI11 point (near the elbow) for 3 minutes three times a week for 4 weeks than among those who did not use the acupressure tool, Dr. Lio began trying it with some of his patients. “Now I use it broadly,” he added in the interview. “Kids over 10 can figure out how to use it and teenagers love it [to relieve itch]. Some don’t use the beads anymore, they just use their fingertips.”

Advice on diet, vitamin D, and probiotics

AD severity is “powerfully” correlated with IgE food allergy, but Dr. Lio said at the meeting that he currently takes a cautious approach toward strict elimination diets.

There is a growing school of thought among allergists, he said, that positive IgE tests without evidence of acute reactions may not indicate true allergy, but rather sensitivity – and may not warrant food eliminations. And as has been shown with peanuts, there can be a serious downside to elimination, as food avoidance can lead to serious allergy later on, he said.

“More and more people are thinking that if you can tolerate [a food], continue it,” he added in the interview. In the absence of clear reactions, the only way to really know if a food is making eczema worse is to do a double-blind, placebo-controlled food challenge test, he noted.

Patients often come to see him believing that food is the “root cause” of their eczema and feeling frustrated, even anxious, about strict dietary restrictions they’ve implemented. But for many of these patients, the right question “would be to ask, why is my eczema causing my food allergy?” he said at the meeting, referring to the epithelial barrier hypothesis, which posits that skin barrier dysfunction can lead to asthma, allergic rhinitis, and food allergy.



Dr. Lio often recommends the Autoimmune Protocol (AIP) diet, a “close cousin” of the paleo diet, for patients with AD as general guidance to be followed “holistically” and often without the strict eliminations it prescribes. Minimizing processed foods and dairy and grains, which “can be inflammatory in some people,” and focusing on whole, nutrient-rich foods – all in keeping with the AIP principles – should have positive effects on the microbiome, overall health, and likely AD as well, he said.

Across the board, Dr. Lio recommends vitamin D (at nationally recommended dosages) and probiotics. Vitamin D has been shown to significantly help a small percentage of patients with eczema, he said, so he advises patients that it’s worth a trial. “I tell patients that I don’t know how to pick that small group out, so let’s try for a few months and see,” he said. “Inevitably, a percentage of patients come back and say it makes a huge difference.”

Dr. Lio’s understanding and use of probiotics have been “dynamic” over the years. The “best, most reliable evidence” that probiotics can improve AD symptoms comes with the use of multiple probiotic strains together, he said. Based on limited but growing literature, he ensures that recommended formulations for babies include Lactobacillus rhamnosus, and that formulations for adults include Lactobacillus salivarius.

Dr. Lio works closely with dietitians, hypnotherapists, and psychologists – and will occasionally refer interested patients with AD to a Chinese medicine practitioner who personalizes the use of herbal formulations.

He reported no relevant disclosures.


Findings may be practice changing for early breast cancer patients

Article Type
Changed
Wed, 01/04/2023 - 16:57

Among high-risk early breast cancer patients, delivery of a radiation boost to the tumor bed during whole breast irradiation was just as safe and effective as delivering the boost sequentially after whole breast irradiation ended. The findings from the phase 3 clinical trial are a boon to patient convenience.

“These findings are indeed practice changing. This was a well-designed trial that looked at shortening treatment from six to three weeks. They showed equivalent local control and, importantly, a good cosmetic outcome over time,” said Kathleen Horst, MD, who served as a discussant for the presentation given by Frank Vicini, MD, FASTRO, of GenesisCare, at the annual meeting of the American Society for Radiation Oncology.

“This is substantially more convenient. It’s cost effective both for the health care system and individual patients. Importantly, our patients come in for treatment every day and they’re taking time from work, which means they have to arrange for childcare and transportation. So, this makes a big difference for these patients,” said Dr. Horst, who is a professor of radiation oncology and director of well-being in the radiation department at Stanford (Calif.) Medicine.

“One of the things that was surprising is that I think all of us were thinking this might be a more toxic regimen, but as Dr. Vicini showed, it was equally effective over time with minimal toxicity and cosmesis was stable over time, which is important. Importantly, it included patient-reported outcomes, not just the physician-reported outcomes. Broadly, I think these findings are applicable for many patients, including all patients who are receiving whole breast radiotherapy with an added boost. I think over time this is going to improve the quality of life of our patients. It represents an innovative change that everyone is going to be excited to embrace,” Dr. Horst said.



Previous randomized controlled trials showed that an additional radiation dose to the tumor bed following lumpectomy and whole breast irradiation reduces the relative risk of local recurrence by about 35%. However, this increases treatment time for patients who have already endured an extensive regimen. For whole breast irradiation, hypofractionated radiation in 15-16 fractions over 3 weeks yields recurrence rates comparable with those of a 5-week regimen, but the relevant trials did not examine the effect hypofractionation may have on a radiation boost to the tumor bed of high-risk patients. Because of this lack of evidence, current practice calls for the boost to remain sequential, in five to eight fractions after completion of whole breast irradiation, which adds 1-1.5 weeks to the treatment course.

The study included 2,262 patients who were randomized to receive a sequential boost or a concomitant boost. After a median follow-up of 7.4 years, there were 54 ipsilateral breast recurrence (IBR) events. The estimated 7-year risk of IBR was 2.2% in the sequential boost group and 2.6% in the concurrent boost group (hazard ratio, 1.32; noninferiority test P = .039). Approximately 60% of patients received adjuvant chemotherapy.
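
As a rough back-of-the-envelope illustration (not part of the trial’s analysis), the reported hazard ratio and the sequential-arm risk can be related under a proportional-hazards assumption, in which the concurrent-arm survival curve equals the sequential-arm curve raised to the power of the hazard ratio. The short Python sketch below uses only the figures quoted above and is approximate.

    # Illustrative arithmetic only, assuming proportional hazards:
    # S_concurrent(t) = S_sequential(t) ** HR, so the reported HR implies an
    # approximate 7-year risk for the concurrent-boost arm.
    risk_sequential_7yr = 0.022   # reported 7-year IBR risk, sequential boost
    hazard_ratio = 1.32           # reported HR, concurrent vs. sequential

    surv_sequential = 1 - risk_sequential_7yr
    implied_risk_concurrent = 1 - surv_sequential ** hazard_ratio

    print(f"Implied 7-year IBR risk, concurrent boost: {implied_risk_concurrent:.1%}")
    # Prints about 2.9%, in the same range as the 2.6% observed in the trial.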

Grade 3 or higher adverse events were similar, with a frequency of 3.3% in the sequential group and 3.5% in the concurrent group (P = .79). The researchers used the Global Cosmetic Score (GCS) to assess outcomes from the perspective of both physicians and patients. Physicians rated the outcome as excellent/good in 86% of the sequential group versus 82% of the concurrent group (P = .33).

“For high-risk early-stage breast cancer patients undergoing breast conservation, a concurrent boost with hypofractionated whole breast irradiation – compared to a sequential boost – results in noninferior local recurrence rates with no significant difference in toxicity, noninferior patient-rated cosmesis, and no significant difference in physician-rated cosmesis. The entire treatment was delivered in three weeks, even for high-risk patients. Just as critical, the use of target volume–based radiation planning for [three-dimensional conformal or IMRT] whole breast irradiation assessed by dose volume analysis is feasible, and resulted in very low toxicity in the treatment arms, regardless of the fractionation schedule or the boost delivery,” Dr. Vicini said.

No conflicts of interest were disclosed for Dr. Horst or Dr. Vicini.


NfL levels might presage MS disability

Article Type
Changed
Mon, 10/31/2022 - 10:41

Neurofilament light chain (NfL) is a well-known and useful biomarker for multiple sclerosis (MS) disease activity, but its association with disease progression is not well understood. A new analysis of MS patients in California’s EPIC cohort suggests that NfL spikes occur about 1 year before clinical signs of MS disease worsening.

“We see evidence for accelerated neuroaxonal damage in the year preceding the first diagnosis of the progression events, [but] only if they were associated with evidence of focal inflammatory activity – that can be either clinical or imaging evidence,” said Ahmed Abdelhak, MD, during a presentation of the study at the annual meeting of the European Committee for Treatment and Research in Multiple Sclerosis (ECTRIMS).

“By the time we diagnose the EDSS progression, it’s already too late. Every damage or any accelerated neuroaxonal damage that has happened in association with this event already took place around a year ago. I think [this has] huge implications for the designing of clinical trials,” said Dr. Abdelhak, who is a postdoctoral researcher at the University of California, San Francisco.

In the study, researchers analyzed data from 609 MS cases, with a total of 3,906 office visits. The median age was 42 years, and 69.6% were female. Median disease duration was 6 years.

They examined the association between NfL scores and confirmed disease worsening, as recorded by an increase in Expanded Disability Status Scale (EDSS) score. Among patients whose progression was associated with a relapse in the past year, the NfL age-adjusted z score increased about 12 months in advance, compared with individuals who did not experience disease progression. There was also a more modest increase among individuals who had disease progression without a recent relapse, but this was not statistically significant.
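
For readers unfamiliar with the age-adjusted z score mentioned above, the Python sketch below shows the general idea; the reference curves are hypothetical placeholders and are not the normative model used in the EPIC analysis.

    import numpy as np

    def age_adjusted_z(nfl_pg_ml, age_years, ref_mean, ref_sd):
        """Express a serum NfL value as a z score against age-specific
        reference values, on the log scale because NfL is right-skewed.
        ref_mean and ref_sd are callables returning the reference log-NfL
        mean and SD for a given age (hypothetical placeholders here)."""
        log_value = np.log(nfl_pg_ml)
        return (log_value - ref_mean(age_years)) / ref_sd(age_years)

    # Placeholder reference curves, for illustration only.
    ref_mean = lambda age: 1.5 + 0.02 * age   # mean log(pg/mL) at a given age
    ref_sd = lambda age: 0.4                  # SD of log(pg/mL), held constant

    print(round(age_adjusted_z(15.0, 42, ref_mean, ref_sd), 2))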

“Our findings suggest that the association between NfL levels and EDSS worsening is most prominent in the setting of relapse-associated events,” said Dr. Abdelhak.
 

Clinical implications and audience skepticism

During the Q&A following the talk, session moderator Charlotte Teunissen, PhD, professor of neurochemistry at Amsterdam University Medical Center, asked about the clinical implication of the finding. “It seems that you concluded that axonal damage has been done before the progression starts. Is that your conclusion? So it means that there is no option to interfere anymore, consequently.”

Dr. Abdelhak responded: “I think that’s a very important interpretation of the data, which I’m sure is a relatively new way of thinking about it. That means, indeed, that when we see these patients, measuring NfL wouldn’t deliver any additional value because they don’t differ between the groups at the time of EDSS worsening. And there is probably nothing more we can do about this event. But it’s still very important to know that any therapeutic intervention has also the need to prevent future disability progression, future neuroaxonal damage, but regarding what has happened already, I’m a little bit skeptical if we will be able to change anything.”

Dr. Teunissen expressed skepticism that there was no further neurodegeneration following the spike in NfL, and pointed out an important caveat, which was the study’s reliance on NfL. “You base your conclusions on what you observe for NfL, and it’s a far-fetched conclusion that there is no further axonal damage ongoing. Maybe NfL is just one marker, and it’s not the best biomarker to measure progression,” she said.

Dr. Abdelhak conceded that it will be necessary to confirm the findings with other biomarkers of neurological injury. Even different subunits of the NfL protein have been shown to have different dynamics in other neurological conditions. “So the data we have give definitely an incomplete picture because we [know] nothing about the other biomarkers of neuroaxonal injury, including the other subunits of NfL,” he said.

Later in the Q&A, Alasdair Coles, MD, professor of neurology at University of Cambridge (England), spoke from the audience. He suggested that the findings could be seen as dispiriting for clinicians. “Would the panel agree that actually for a clinician this is all rather disappointing, because none of these markers are telling us anything that we don’t otherwise know by examining the patient and doing scans?”

“I can attempt to tackle that provocative question,” replied Elias Sotirchos, MD, who also presented research on an association between NfL and brain atrophy during the session. He pointed out that all clinical tests are imperfect, and suggested that NfL isn’t something to be used in isolation. It could be useful when patients are experiencing new symptoms, or worsening symptoms, and in combination with MRI results. “My interpretation of NfL is that it does have incremental value, telling us which patients have lesions that are more destructive, potentially, given all of these consistent associations with brain atrophy and disability progression over time,” said Dr. Sotirchos, who is an assistant professor of neurology at Johns Hopkins Medicine, Baltimore.

Dr. Abdelhak and Dr. Teunissen have no relevant financial disclosures. Dr. Sotirchos has financial relationships with Alexion, Viela Bio, Horizon Therapeutics, Genentech, and Ad Scientiam.


ctDNA hints at esophageal cancer outcomes

Article Type
Changed
Mon, 10/31/2022 - 09:23

Circulating tumor DNA (ctDNA) has garnered attention in recent years as a potential noninvasive biomarker that could help determine prognosis and treatment responses in solid tumors. It could also provide a more complete picture of tumor genetics than the limited samples often available from a biopsy.

ctDNA studies have been conducted in a range of solid tumors, but esophageal cancer has received less attention than other cancers. Esophageal cancer is currently diagnosed by endoscopy, but this method is not suitable for population-wide surveillance because of its cost and invasiveness.

Esophageal squamous cell carcinoma (ESCC) is the predominant histologic type of esophageal cancer in China, and it is difficult to diagnose using normal radiological techniques because of the hollow nature of the esophagus.

In a virtual poster session at the annual meeting of the American Society for Radiation Oncology, Xin Wang, MD, discussed the results of a small study looking at ctDNA and ESCC. “We aimed to investigate if ctDNA could detect disease progression before radiological imaging and try identifying patients with inferior prognosis based on ctDNA positivity and dynamics,” said Dr. Wang, who is a researcher at the Chinese Academy of Medical Sciences, Beijing.

Of the enrolled patients, 85% were male, and the median age at diagnosis was 64 years. Gross tumor volume was larger among patients who were ctDNA positive at baseline (40.1 cm3 vs. 28.7 cm3; P = .001), and 14% of ctDNA-positive patients underwent esophagectomy following radiotherapy, compared with 58% of the ctDNA-negative group (P = .008). Other baseline factors were similar between the two groups.

The researchers used a 474-gene panel to analyze plasma samples; 106 of the genes are known to be associated with radiosensitivity. Prior to radiotherapy (T0), 28 of 40 patients (70%) had a positive ctDNA sample. At week 4 of radiotherapy (T1), 42% of 36 patients were ctDNA positive. One to 3 months after radiotherapy/chemoradiotherapy (T2), 30% of 27 patients were ctDNA positive. In all, 27 patients ultimately underwent esophagectomy, while 9 did not have surgery. Three to 6 months after radiotherapy/chemoradiotherapy (T3), 22% of 23 patients were ctDNA positive. Of 14 patients alive after 1 year, 43% were ctDNA positive.

Over a median follow-up of 20.6 months, 17 patients were diagnosed with progression through radiological imaging. Of these, 13 patients (77%) were ctDNA positive before or after progression (Cohen’s kappa, 0.512; P < .01). The mean lead time was 5.5 months (95% confidence interval, 1.5-9.4 months).
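
For context, Cohen’s kappa measures agreement between two binary classifications beyond what chance would produce. The Python sketch below shows how such a statistic might be computed for per-patient ctDNA status versus radiologic progression; the vectors are fabricated for illustration and are not the study data.

    from sklearn.metrics import cohen_kappa_score

    # Hypothetical per-patient labels (1 = ctDNA positive / progressed, 0 = not);
    # made up for illustration, not drawn from the study.
    ctdna_positive         = [1, 1, 0, 1, 0, 1, 1, 0, 0, 1, 1, 0, 1, 1, 0, 1]
    radiologic_progression = [1, 1, 0, 1, 0, 0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 1]

    kappa = cohen_kappa_score(ctdna_positive, radiologic_progression)
    print(f"Cohen's kappa: {kappa:.3f}")  # the study reported kappa = 0.512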

The researchers also observed links between ctDNA and survival. “We observed a strong association between inferior progression-free survival [PFS] and ctDNA positivity at T1, T2, and T3 time points. Similar associations were detected in OS [overall survival] as well,” Dr. Wang said.

In a multivariate analysis, ctDNA positivity at T1 was associated with worse PFS (hazard ratio, 3.35; 95% CI, 1.10-10.22), and there was a trend toward worse overall survival (HR, 2.48; 95% CI, 0.83-7.37). There were no statistically significant associations between ctDNA positivity and PFS or OS at T2.
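
The multivariate result above comes from a Cox proportional hazards model. A minimal sketch of that kind of model, fit with the lifelines package on fabricated patient-level data rather than the study’s, is shown below; the column names are placeholders.

    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(1)
    n = 40

    # Fabricated patient-level data: PFS in months, an event indicator, and a
    # binary ctDNA-at-T1 covariate. None of this reproduces the study data.
    df = pd.DataFrame({
        "pfs_months": rng.exponential(scale=12, size=n),
        "progressed": rng.integers(0, 2, size=n),
        "ctdna_pos_t1": rng.integers(0, 2, size=n),
    })

    cph = CoxPHFitter()
    cph.fit(df, duration_col="pfs_months", event_col="progressed")
    print(np.exp(cph.params_))  # hazard ratio for each covariate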

Twenty-one patients experienced a decrease in ctDNA concentration between T0 and T1. Of these, eight patients achieved a clearance of ctDNA by T1, and they had a trend toward better PFS than patients who did not achieve clearance (HR, 0.31; P = .06).

“The relatively poor locoregional recurrence-free survival remains related to ctDNA positivity at T1. Interestingly, for ctDNA-negative patients who received surgery, none of them were diagnosed with radiological progression. To summarize, ctDNA is a promising biomarker for detecting disease progression. Positive ctDNA status indicates [worse] PFS and OS, but patients achieving ctDNA clearance after radiation are likely to have a better PFS. There is also a potential association between ctDNA positivity at the fourth week during radiation therapy and higher risk of local recurrence, but further studies with a larger sample size are required,” Dr. Wang said.

Ann Raldow, MD, who served as a discussant following the poster presentation, pointed out that ctDNA has been found to be a useful prognostic and predictive tool in colon cancer. The new work suggests “that detectable ctDNA may help guide recommendations for postchemoradiation treatment. Of course, the ctDNA and esophageal cancer space is still in its infancy, and I would really encourage future studies to incorporate ctDNA as part of what they’re studying so that we can get more information about both the prognostic and predictive value of ctDNA in esophageal cancer,” said Dr. Raldow, who is an assistant professor of radiation oncology, University of California, Los Angeles.

Dr. Wang has no relevant financial disclosures. Dr. Raldow has received research funding from Intelligent Automation, Clarity, and Viewray.


Radiotherapy shows benefit in difficult liver cancer cases

Article Type
Changed
Mon, 10/31/2022 - 13:10

Among patients with advanced hepatocellular carcinoma (HCC), and especially those with macrovascular invasion, stereotactic body radiation therapy (SBRT) appears to grant a survival benefit when added to systemic therapy. That was the finding from a phase 3 clinical trial presented at the annual meeting of the American Society for Radiation Oncology.

For unresectable HCC or cases that cannot be treated with thermal ablation or regional therapy, the current standard of care is systemic therapy. When the study was conducted, the recommended therapy was sorafenib, a tyrosine kinase inhibitor. But with the publication of the IMbrave150 study in 2021, atezolizumab plus bevacizumab is now increasingly preferred by some oncologists.

In 2008, the SHARP study found that sorafenib improved median survival, but it provided less benefit for patients with macrovascular invasion. Various studies have addressed the question of whether radiation could improve survival among this patient population, but results have not been encouraging. Direct comparisons between sorafenib and radiotherapy in the SARAH and SIRveNIB studies showed no significant differences in outcomes.

To determine the efficacy of combined SBRT and sorafenib, researchers randomized 177 patients with locally advanced HCC to receive 400 mg sorafenib every 12 hours, or SBRT of 27.5-50 Gy in five fractions followed by 200 mg sorafenib every 12 hours for 4 weeks and then 400 mg sorafenib every 12 hours thereafter. The median age was 66 years, 85% of patients were male, and 74% had macrovascular invasion. The study included patients with locally advanced tumors up to a 20-cm sum of diameters or up to a 20-cm conglomerate tumor, as well as those with metastases 3 cm or smaller.

After a median follow-up of 13.2 months, median overall survival was 15.8 months in the combination group, versus 12.3 months in the sorafenib group (hazard ratio, 0.77; 1-sided P = .055). In a multivariable analysis, the combined treatment was associated with better overall survival (HR, 0.72; P = .042).

“This overall survival is greater than expected and impressive even in the era now of immunotherapy trials,” said Laura Dawson, MD, who presented the results of the study during a press conference at the meeting. Dr. Dawson is a professor of radiation oncology at University of Toronto and a radiation oncologist at Princess Margaret Hospital in Toronto.

Median progression-free survival was 9.2 months in the combined group versus 5.5 months in the sorafenib-only group (HR, 0.55; P = .0001). At 24 months, 17% of the combination group versus 7% of the sorafenib group remained free of progression. The median time to progression was 18.5 months in the combination group and 9.5 months in the sorafenib group (HR, 0.69; P = .034). The frequency of adverse events was similar in both groups. The study admitted patients with any level of vascular invasion, which contrasted with many earlier trials that excluded those with involvement of the main portal vein.
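
As a loose illustration of how progression-free survival comparisons like the one above are typically summarized, the Python sketch below uses the lifelines package on simulated exponential survival times whose medians are only roughly calibrated to the reported 9.2 and 5.5 months; it does not reproduce the trial’s analysis.

    import numpy as np
    from lifelines import KaplanMeierFitter
    from lifelines.statistics import logrank_test

    rng = np.random.default_rng(0)

    # Simulated PFS times in months (exponential, median = scale * ln 2);
    # loosely calibrated to the reported medians, for illustration only.
    pfs_combo = rng.exponential(scale=9.2 / np.log(2), size=90)
    pfs_sorafenib = rng.exponential(scale=5.5 / np.log(2), size=87)
    events_combo = np.ones_like(pfs_combo)          # 1 = progression observed
    events_sorafenib = np.ones_like(pfs_sorafenib)  # no censoring in this toy data

    kmf = KaplanMeierFitter()
    kmf.fit(pfs_combo, event_observed=events_combo, label="SBRT + sorafenib")
    print("Median PFS, combination:", round(kmf.median_survival_time_, 1), "months")

    result = logrank_test(pfs_combo, pfs_sorafenib,
                          event_observed_A=events_combo,
                          event_observed_B=events_sorafenib)
    print("Log-rank p-value:", round(result.p_value, 4))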

“I think this is really one of the most important studies that’s come out in many years in terms of practice-changing outcomes. We’ve seen that with patients who have very high-risk HCC, especially patients who have portal vein or macrovascular invasion, there’s been a significant improvement in overall survival for these patients, and this is a very difficult patient population. Adding SBRT in this group improved both the progression-free survival and overall survival, so I think we’re really at a point where we can call this a standard of care for patients,” Karyn A. Goodman, MD, professor and vice chair of clinical research and radiation oncology at the Icahn School of Medicine at Mount Sinai, New York, said at the press conference.

A limitation of the study is that it closed early to accrual because of a change in the standard of care.

Dr. Goodman has served on advisory boards for Novartis, Philips Healthcare, and Genentech, and has consulted for RenovoRx and Syntactx. Dr. Dawson has received research grants from Merck and received patent/license fees or copyright compensation from RaySearch.
