
Study finds discrepancies in biopsy decisions, diagnoses based on skin type


Among dermatology residents and attending dermatologists, rates of diagnostic accuracy and appropriate biopsy recommendations were significantly lower for patients with skin of color, compared with White patients, new research shows.

“Our findings suggest diagnostic biases based on skin color exist in dermatology practice,” lead author Loren Krueger, MD, assistant professor in the department of dermatology, Emory University School of Medicine, Atlanta, said at the Annual Skin of Color Society Scientific Symposium. “A lower likelihood of biopsy of malignancy in darker skin types could contribute to disparities in cutaneous malignancies,” she added.



Disparities in dermatologic care among Black patients, compared with White patients, have been well documented. Recent evidence includes a 2020 study that showed significant shortcomings among medical students in correctly diagnosing squamous cell carcinoma, urticaria, and atopic dermatitis for patients with skin of color.

“It’s no secret that our images do not accurately or in the right quantity include skin of color,” Dr. Krueger said. “Yet few papers talk about how these biases actually impact our care. Importantly, this study demonstrates that diagnostic bias develops as early as the medical student level.”

To further investigate the role of skin color in the assessment of neoplastic and inflammatory skin conditions and decisions to perform biopsy, Dr. Krueger and her colleagues surveyed 144 dermatology residents and attending dermatologists to evaluate their clinical decision-making skills in assessing skin conditions for patients with lighter skin and those with darker skin. Almost 80% (113) provided complete responses and were included in the study.

For the survey, participants were shown photos of 10 neoplastic and 10 inflammatory skin conditions. Each condition was matched across patients with lighter skin (types I-II) and darker skin (types IV-VI), and the images were presented in random order. Participants were asked to identify the suspected underlying etiology (neoplastic–benign, neoplastic–malignant, papulosquamous, lichenoid, infectious, bullous, or no suspected etiology) and whether they would choose to perform biopsy for the pictured condition.

Overall, their responses showed a slightly higher probability of recommending a biopsy for patients with skin types IV-VI (odds ratio, 1.18; P = .054).

However, respondents were more than twice as likely to recommend a biopsy for benign neoplasms for patients with skin of color, compared with those with lighter skin types (OR, 2.57; P < .0001). They were significantly less likely to recommend a biopsy for a malignant neoplasm for patients with skin of color (OR, 0.42; P < .0001).
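The odds ratios reported here summarize 2x2 tables of biopsy decisions. As a minimal sketch of how such a figure is derived, the snippet below computes an odds ratio from hypothetical counts; the numbers are illustrative only and are not the study's data.

```python
def odds_ratio(a: int, b: int, c: int, d: int) -> float:
    """Odds ratio for a 2x2 table:

                  biopsy   no biopsy
    darker skin      a         b
    lighter skin     c         d
    """
    return (a * d) / (b * c)

# Hypothetical example: 60 of 100 benign lesions biopsied in darker skin
# vs. 37 of 100 in lighter skin (illustrative counts, not the study's).
or_benign = odds_ratio(60, 40, 37, 63)
print(round(or_benign, 2))  # about 2.55
```

In practice such estimates come with confidence intervals and P values (e.g., via logistic regression or Fisher's exact test), which the simple ratio above omits.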

In addition, the correct etiology was much more commonly missed in diagnosing patients with skin of color, even after adjusting for years in dermatology practice (OR, 0.569; P < .0001).

Conversely, respondents were significantly less likely to recommend a biopsy for benign neoplasms and were more likely to recommend a biopsy for malignant neoplasms among White patients. Etiology was more commonly correct.



The findings underscore that “for skin of color patients, you’re more likely to have a benign neoplasm biopsied, you’re less likely to have a malignant neoplasm biopsied, and more often, your etiology may be missed,” Dr. Krueger said at the meeting.

Of note, while 45% of respondents were dermatology residents or fellows, 20.4% had 1-5 years of experience, and about 28% had 10 to more than 25 years of experience.

And while 75% of the dermatology residents, fellows, and attendings were White, there was no difference in the probability of correctly identifying the underlying etiology in dark or light skin types based on the provider’s self-identified race.

Importantly, the patterns in the study of diagnostic discrepancies are reflected in broader dermatologic outcomes. The 5-year melanoma survival rate is 74.1% among Black patients and 92.9% among White patients. Dr. Krueger referred to data showing that only 52.6% of Black patients have stage I melanoma at diagnosis, whereas among White patients, the rate is much higher, at 75.9%.

“We know skin malignancy can be more aggressive and late-stage in skin of color populations, leading to increased morbidity and later stage at initial diagnosis,” Dr. Krueger told this news organization. “We routinely attribute this to limited access to care and lack of awareness on skin malignancy. However, we have no evidence on how we, as dermatologists, may be playing a role.”

Furthermore, the decision to perform biopsy or not can affect the size and stage at diagnosis of a cutaneous malignancy, she noted.

Key changes needed to prevent the disparities – and their implications – should start at the training level, she emphasized. “I would love to see increased photo representation in training materials – this is a great place to start,” Dr. Krueger said.

In addition, “encouraging medical students, residents, and dermatologists to learn from skin of color experts is vital,” she said. “We should also provide hands-on experience and training with diverse patient populations.”

The first step to addressing biases “is to acknowledge they exist,” Dr. Krueger added. “I am hopeful this inspires others to continue to investigate these biases, as well as how we can eliminate them.”

The study was funded by the Rudin Resident Research Award. The authors have disclosed no relevant financial relationships.

A version of this article first appeared on Medscape.com.


Lung cancer in 2030: Expand genotyping


In recent years, patients with advanced lung cancer have benefited from the advent of immune therapies and genotype-directed therapies, both of which have led to improved survival rates. But what will lung cancer look like in 2030?

Pasi A. Janne, MD, PhD, of the Dana-Farber Cancer Institute, Boston, hopes to see improved access to tumor and blood-based genotyping.

Dr. Janne, who serves as director of the Lowe Center for Thoracic Oncology at Dana-Farber, gave a keynote presentation at the 2022 European Lung Cancer Congress, where he highlighted the need to broaden the scope of targeted therapies, make “great drugs work even better,” improve the ability to treat patients based on risk level, and expand the use of targeted therapies in the adjuvant and neoadjuvant settings to make significant progress in lung cancer treatment in the coming years.

Genotyping is underutilized, he said. A 2019 multicenter study reported at the annual meeting of the American Society of Clinical Oncology showed that only 54% of 1,203 patients underwent testing for EGFR mutations, 22% were tested for EGFR, ALK, ROS1, and BRAF mutations, and only 7% were tested for all biomarkers recommended by National Comprehensive Cancer Network guidelines at the time.

That study also showed that only 45% of patients received biomarker-driven treatment, even when driver mutations were detected.

“Immunotherapy was often prescribed instead of targeted therapy, even when molecular results were available,” Dr. Janne said.

Another study, reported at the 2021 ASCO annual meeting, showed some improvement in testing rates, but still, only 37% of patients were tested for all biomarkers as recommended.

Racial disparities in testing have also been observed. Bruno and colleagues found that any next-generation sequencing was performed in 50.1% of White patients, compared with 39.8% of Black patients, and NGS prior to first-line therapy was performed in 35.5% and 25.8%, respectively.

The study, also reported at ASCO in 2021, showed that trial participation was observed among 3.9% of White patients and 1.9% of Black patients.

“The studies really highlight the need for increased testing rates and appropriate utilization of testing results to deliver optimal care to our patients with advanced lung cancer. We have a long way to go. To live the promise and appreciate the promise of precision therapy ... we need to be able to offer this testing to all of our patients with lung cancer,” he said.

Dr. Janne reported relationships with numerous pharmaceutical companies, including consulting, research support and stock ownership. He also receives postmarketing royalties from Dana-Farber Cancer Institute–owned intellectual property on EGFR mutations.

FROM ELCC 2022


Weighing the complexity of pathological response in lung cancer


Pathological response has emerged as a valuable endpoint and surrogate marker for overall survival in lung cancer studies, but much work remains to be done, said William D. Travis, MD, director of thoracic pathology at Memorial Sloan Kettering Cancer Center, New York.

In a keynote address at the 2022 European Lung Cancer Conference, Dr. Travis highlighted advances in the use of pathological response in this setting and outlined areas that need refinement. “Pathologic response after preoperative therapy is important because the extent of pathologic response strongly correlates with improved overall survival, and it is reflective of neoadjuvant therapy. The degree of response is associated with the degree of benefit in survival, and it’s being used as a surrogate for survival in phase 2 and 3 neoadjuvant clinical trials.”

In fact, multiple studies have demonstrated that non–small cell lung cancer patients with 10% or less viable residual tumor after treatment have improved overall survival and disease-free survival, compared with patients who have more residual tumor, he explained.

Recent studies have demonstrated the value of pathological response as an endpoint in the neoadjuvant therapy and molecular targeted therapy setting, he said, citing a study published in the Journal of Clinical Oncology that showed major pathological response rates of 14%-45% and pathological complete response rates up to 29% in patients treated with single-agent checkpoint inhibition.

In the CheckMate 816 trial, both major pathologic response and pathological complete response were significantly higher in patients treated with combination nivolumab and chemotherapy, compared with those treated with chemotherapy alone (37% vs. 8.9% and 24% vs. 2%, respectively).

“This high rate of responses with combined immunotherapy and chemotherapy is quite exciting,” he said.

Dr. Travis also stressed the importance of consulting the current International Association for the Study of Lung Cancer Recommendations for Pathologic Assessment of Lung Cancer Resection Specimens After Neoadjuvant Therapy.

He highlighted several key points regarding pathological response in lung cancer:

  • Major pathological response (MPR) is calculated as the estimated size of viable tumor divided by the size of the tumor bed.
  • The optimal cutoff for determining MPR is currently 10%, but recent data suggest that in the conventional chemotherapy setting this may vary by tumor histology, with a much higher cutoff of about 65% for adenocarcinoma.
  • Estimating the amount of viable tumor is “quite complicated and requires quite a number of steps,” and one of the most important steps is “for the surgeon to [let] the pathologist know that [a] given specimen is from a patient who received neoadjuvant therapy.”
  • Determining the border of the tumor bed can be challenging, therefore “resection specimens after neoadjuvant therapy should be sampled to optimize comprehensive gross and histologic assessment of the lung tumor bed for pathologic response ... as outlined in the guidelines.”
  • The IASLC panel determined that having a single approach for estimating treatment effect would be best, despite the different therapy types and combinations used, but “it is recognized that there may be certain types of features that need to be addressed,” such as immune cell infiltrates in patients who received immunotherapy.
  • The recommendations provide specific guidance for measuring tumor size for staging, including for special circumstances.
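The MPR calculation described in the first bullet can be sketched as a small function. This is a simplified illustration of the ratio and the 10% cutoff only; the function names are illustrative, and the actual IASLC assessment involves histologic estimation across many slides, not a single size measurement.

```python
def percent_viable_tumor(viable_size: float, tumor_bed_size: float) -> float:
    """Estimated viable tumor as a percentage of the tumor bed
    (per the ratio described above; sizes in the same units)."""
    return 100.0 * viable_size / tumor_bed_size

def is_major_pathological_response(viable_size: float,
                                   tumor_bed_size: float,
                                   cutoff_pct: float = 10.0) -> bool:
    """MPR = viable tumor at or below the cutoff (10% by convention,
    though the text notes histology-specific cutoffs are under study)."""
    return percent_viable_tumor(viable_size, tumor_bed_size) <= cutoff_pct

# A tumor bed of 50 mm with 4 mm of viable tumor is 8% viable, so it
# would qualify as an MPR at the conventional 10% cutoff.
print(is_major_pathological_response(4.0, 50.0))
```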

As for future direction, Dr. Travis said, “one question is how to assess treatment effect in lymph node samples.

“This is done for lymph nodes in breast cancer but not in lung cancer. We need system[s] for lung cancer.”

Good “infrastructure for pathology departments” is needed to support clinical trials, he said, noting that the team at Memorial Sloan Kettering Cancer Center includes physician assistants, tissue procurement staff, frozen section techs, research fellows, and research assistants.

Future work should also aim to standardize pathology assessment for clinical trials, improve the current recommendations, make use of new technology like artificial intelligence, optimize banking protocols and special techniques, and identify radiologic-pathological correlations, he said.

He added that “IASLC is promoting the design and implementation of an international database to collect uniformly clinical and pathologic information with the ultimate goal of fostering collaboration and to facilitate the identification of surrogate endpoints of long-term survival.”

Dr. Travis is a nonpaid pathology consultant for the LCMC3 and LCMC4 trials.

FROM ELCC 2022


Mutation testing recommended for advanced and refractory thyroid cancer

A new consensus statement from the American Head and Neck Society Endocrine Surgery Section and International Thyroid Oncology Group focuses on a definition of advanced thyroid cancer and outlines strategies for mutation testing and targeted treatment.

Mutation testing should not be pursued if cancer burden and disease threat are low, since most thyroid cancers have a very good prognosis and are highly treatable. But 15% of differentiated thyroid cancer cases are locally advanced, and radioiodine refractory differentiated thyroid cancer has a 10-year survival below 50%.

More generally, advanced thyroid cancer has not been well defined clinically. Physicians with experience diagnosing advanced disease may recognize it, but there is no widely accepted definition. “This may be the first time that an expert group of physicians has attempted to define what advanced thyroid cancer is,” said David Shonka, MD, who is a coauthor of the consensus statement, which was published online in Head & Neck. He is an associate professor of otolaryngology/head and neck surgery at the University of Virginia, Charlottesville.

“All patients with advanced thyroid disease and most patients with incurable radioiodine refractory differentiated thyroid cancer should undergo somatic mutational testing,” the authors wrote. “Next-generation sequencing can reveal targetable mutations and potentially give patients affected by advanced thyroid carcinoma systemic treatment options that can prolong survival. These new innovative approaches are changing the landscape of clinical care for patients with advanced thyroid cancer.”

For differentiated thyroid cancer and medullary thyroid carcinoma, the authors created a definition that combines structural factors on imaging, along with surgical findings, and biochemical, histologic, and molecular factors. Anaplastic thyroid cancer should always be considered advanced, even after a complete resection and incidental pathological identification.

The statement also summarizes recent advances in thyroid cancer that have revealed molecular markers which contribute to oncogenesis. Initially, those approaches were applied to indeterminate fine needle biopsies to improve diagnosis. More recent studies used them to match patients to targeted therapies. There are Food and Drug Administration–approved therapies targeting the BRAF and RET mutations, but advanced thyroid cancer is also included in some “basket” trials that test targeted agents against driver mutations across multiple tumor types.

Radioiodine refractory differentiated thyroid cancer had few treatments as recently as 10 years ago. But recent research has shown that multikinase inhibitors improve outcomes, and a range of mutations have been found in this type of thyroid cancer, including BRAF V600E, RET, PIK3CA, and PTEN, and fusions involving RET, NTRK, and ALK. Other mutations have been linked to more aggressive disease. Efforts to personalize treatment also include microsatellite stability status, tumor mutational burden, and programmed death–ligand 1 status as indicators for immunotherapy. “With discovery of many other molecular targets, and emerging literature showcasing promise of matched targeted therapies, we recommend that all patients with advanced thyroid cancer have comprehensive genomic profiling on tumor tissue through (next generation sequencing),” the authors wrote.

These newer therapies have presented physicians with options beyond surgery, chemotherapy, and radiotherapy, which have low efficacy against advanced thyroid cancer. “It is an area in which there has been substantial change. Even 5-7 years ago, patients with advanced thyroid cancer that was not responsive to radioactive iodine or surgery really didn’t have a lot of options. This is really an exciting and growing field,” Dr. Shonka said.

He specifically cited anaplastic thyroid cancer, which like radioiodine refractory differentiated thyroid cancer has had few treatment options until recently. “Now, if you see a patient with anaplastic thyroid cancer, your knee-jerk reaction should be ‘let’s do molecular testing on this, this is definitely advanced disease.’ If they have a BRAF mutation, that’s targetable, and we can treat this patient with combination therapy that actually improves their survival. So, there’s some exciting stuff happening and probably more coming down the road as we develop new drugs that can target these mutations that we’re identifying.”

Dr. Shonka has no relevant financial disclosures.

FROM HEAD & NECK


Study suggests keto diet increases tumor growth in ovarian cancer

A ketogenic diet fed to mice with epithelial ovarian cancer led to significantly increased tumor growth and gut microbiome alterations, according to a study recently presented at the annual meeting of the Society of Gynecologic Oncology.

“The keto diet is very popular, especially among patients who believe it may treat cancer by starving tumors of the fuel they need to grow, altering the immune system, and other anticancer effects,” said study leader Mariam AlHilli, MD, of the Cleveland Clinic.

The findings are surprising because in other studies the high-fat, zero-carb ketogenic diet has demonstrated tumor-suppressing effects. It has been under study as a possible adjuvant therapy for other cancers, such as glioblastoma, colon cancer, prostate cancer, and pancreatic cancer.

“While we don’t know yet whether these findings extend to patients, the results in animals indicate that instead of being protective, the keto diet appears to promote ovarian cancer growth and progression,” Dr. AlHilli said.

In the present study, tumor-bearing mice were fed a keto diet consisting of 10% protein, 0% carbohydrates, and 90% fat, while the high-fat diet was 10% protein, 15% carbohydrates, and 75% fat. The control diet consisted of 10% protein, 77% carbohydrates, and 13% fat. Epithelial ovarian cancer tumor growth was monitored weekly.

Over the 6- to 10-week course of the study, a 9.1-fold increase from baseline in tumor growth was observed in the keto diet–fed mice (n = 20). Among mice fed a high-fat diet (n = 20) that included some carbohydrates, tumor growth increased 2.0-fold from baseline, and among control group mice (n = 20) fed a low-fat, high-carbohydrate diet, tumor growth increased 3.1-fold.

The investigators observed several hallmarks of tumor progression: tumor-associated macrophages were enriched significantly, activated lymphoid cells (natural killer cells) were significantly reduced (P < .001), and M2:M1 polarization trended higher. Also, in keto diet–fed mice, gene set enrichment analysis revealed that epithelial ovarian cancer tumors had increased angiogenesis and inflammatory responses, an enhanced epithelial-to-mesenchymal transition phenotype, and altered lipid metabolism. Compared with high-fat diet–fed mice, the keto-fed mice had increases in lipid catalytic activity and catabolism, as well as decreases in lipid synthesis.

“The tumor increase could be mediated by the gut microbiome or by gene alterations or by metabolite levels that influence tumor growth. It’s possible that each cancer type is different. The composition of the diet may be a factor, as well as how tumors metabolize fat and ketones,” Dr. AlHilli said.

The results need to be confirmed in preclinical animal studies and in additional models, she added.

The study was funded by a K12 Grant and internal funding from Cleveland Clinic. Dr. AlHilli declared no relevant disclosures.

Meeting/Event
Publications
Topics
Sections
Meeting/Event
Meeting/Event

A ketogenic diet fed to mice with epithelial ovarian cancer led to significantly increased tumor growth and gut microbiome alterations, according to study recently presented at the annual meeting of the Society of Gynecologic Oncology.

“The keto diet is very popular, especially among patients who believe it may treat cancer by starving tumors of the fuel they need to grow, altering the immune system, and other anticancer effects,” said study leader Mariam AlHilli, MD, of the Cleveland Clinic.

The findings are surprising because in other studies the high-fat, zero-carb ketogenic diet has demonstrated tumor-suppressing effects. It has been under study as a possible adjuvant therapy for other cancers, such as glioblastoma, colon cancer, prostate cancer, and pancreatic cancer.

“While we don’t know yet whether these findings extend to patients, the results in animals indicate that instead of being protective, the keto diet appears to promote ovarian cancer growth and progression,” Dr. AlHilli said. In the present study, tumor bearing mice were fed a keto diet consisting of 10% protein, 0% carbohydrates, and 90% fat, while the high-fat diet was 10% protein, 15% carbohydrates, and 75% fat. The control diet consisted of 10% protein, 77% carbohydrates, and 13% fat. Epithelial ovarian cancer tumor growth was monitored weekly.

Over the 6- to 10-week course of study, a 9.1-fold increase from baseline in tumor growth was observed in the keto diet-fed mice (n = 20). Among mice fed a high-fat diet (n = 20) that included some carbohydrates, tumor growth increased 2.0-fold from baseline, and among control group mice (n = 20) fed a low-fat, high carbohydrate diet, tumor growth increased 3.1-fold.

The investigators observed several hallmarks of tumor progression: tumor associated macrophages were enriched significantly, activated lymphoid cells (natural killer cells) were significantly reduced (P < .001), and M2:M1 polarization trended higher. Also, in keto diet–fed mice, gene set enrichment analysis revealed that epithelial ovarian cancer tumors had increased angiogenesis and inflammatory responses, enhanced epithelial-to-mesenchymal transition phenotype, and altered lipid metabolism. Compared with high-fat diet–fed mice, the keto-fed mice had increases in lipid catalytic activity and catabolism, as well as decreases in lipid synthesis.

“The tumor increase could be mediated by the gut microbiome or by gene alterations or by metabolite levels that influence tumor growth. It’s possible that each cancer type is different. The composition of the diet may be a factor, as well as how tumors metabolize fat and ketones,” Dr. AlHilli said.

The results need to be confirmed in preclinical animal studies and in additional models, she added.

The study was funded by a K12 Grant and internal funding from Cleveland Clinic. Dr. AlHilli declared no relevant disclosures.

A ketogenic diet fed to mice with epithelial ovarian cancer led to significantly increased tumor growth and gut microbiome alterations, according to study recently presented at the annual meeting of the Society of Gynecologic Oncology.

“The keto diet is very popular, especially among patients who believe it may treat cancer by starving tumors of the fuel they need to grow, altering the immune system, and other anticancer effects,” said study leader Mariam AlHilli, MD, of the Cleveland Clinic.

The findings are surprising because in other studies the high-fat, zero-carb ketogenic diet has demonstrated tumor-suppressing effects. It has been under study as a possible adjuvant therapy for other cancers, such as glioblastoma, colon cancer, prostate cancer, and pancreatic cancer.

“While we don’t know yet whether these findings extend to patients, the results in animals indicate that instead of being protective, the keto diet appears to promote ovarian cancer growth and progression,” Dr. AlHilli said. In the present study, tumor bearing mice were fed a keto diet consisting of 10% protein, 0% carbohydrates, and 90% fat, while the high-fat diet was 10% protein, 15% carbohydrates, and 75% fat. The control diet consisted of 10% protein, 77% carbohydrates, and 13% fat. Epithelial ovarian cancer tumor growth was monitored weekly.

Over the 6- to 10-week course of study, a 9.1-fold increase from baseline in tumor growth was observed in the keto diet-fed mice (n = 20). Among mice fed a high-fat diet (n = 20) that included some carbohydrates, tumor growth increased 2.0-fold from baseline, and among control group mice (n = 20) fed a low-fat, high carbohydrate diet, tumor growth increased 3.1-fold.

The investigators observed several hallmarks of tumor progression: tumor-associated macrophages were significantly enriched, activated lymphoid cells (natural killer cells) were significantly reduced (P < .001), and M2:M1 polarization trended higher. Also, in keto diet–fed mice, gene set enrichment analysis revealed that epithelial ovarian cancer tumors had increased angiogenesis and inflammatory responses, an enhanced epithelial-to-mesenchymal transition phenotype, and altered lipid metabolism. Compared with high-fat diet–fed mice, the keto-fed mice had increases in lipid catalytic activity and catabolism, as well as decreases in lipid synthesis.

“The tumor increase could be mediated by the gut microbiome or by gene alterations or by metabolite levels that influence tumor growth. It’s possible that each cancer type is different. The composition of the diet may be a factor, as well as how tumors metabolize fat and ketones,” Dr. AlHilli said.

The results need to be confirmed in preclinical animal studies and in additional models, she added.

The study was funded by a K12 Grant and internal funding from Cleveland Clinic. Dr. AlHilli declared no relevant disclosures.

FROM SGO 2022

AI model predicts ovarian cancer responses

An artificial intelligence (AI) model successfully predicted which patients with high-grade serous ovarian cancer would have excellent responses to therapy. The model, using still-frame images from pretreatment laparoscopic surgical videos, had an overall accuracy rate of 93%, according to the pilot study’s first author, Deanna Glassman, MD, an oncology fellow at the University of Texas MD Anderson Cancer Center, Houston.

Dr. Glassman described her research in a presentation given at the annual meeting of the Society of Gynecologic Oncology.

While the AI model successfully identified all excellent-response patients, it did classify about a third of patients with poor responses as excellent responses. The smaller number of images in the poor-response category, Dr. Glassman speculated, may explain the misclassification.

Researchers took 435 representative still-frame images from pretreatment laparoscopic surgical videos of 113 patients with pathologically proven high-grade serous ovarian cancer. They used 70% of the images to train the model, 10% for validation, and 20% for testing. They developed the AI model with images from four anatomical locations (diaphragm, omentum, peritoneum, and pelvis), training it using deep learning and neural networks to extract morphological disease patterns for correlation with either of two outcomes: excellent response or poor response. An excellent response was defined as progression-free survival (PFS) of 12 months or more, and a poor response as PFS of 6 months or less. In the retrospective study of images, after exclusion of 32 gray-zone patients, 75 patients (66%) had durable responses to therapy and 6 (5%) had poor responses.

The PFS was 19 months in the excellent-response group and 3 months in the poor-response group.

Clinicians have often observed differences in gross morphology within the single histologic diagnosis of high-grade serous ovarian cancer. The research intent was to determine whether AI could detect these distinct morphological patterns in the still-frame images taken at the time of laparoscopy and correlate them with eventual clinical outcomes. Dr. Glassman and colleagues are currently validating the model with a much larger cohort and will look into clinical testing.

“The big-picture goal,” Dr. Glassman said in an interview, “would be to utilize the model to predict which patients would do well with traditional standard of care treatments and those who wouldn’t do well so that we can personalize the treatment plan for those patients with alternative agents and therapies.”

Once validated, the model could also be employed to identify patterns of disease in other gynecologic cancers or to distinguish between viable and necrotic malignant tissue.

The study’s predominant limitation was the small sample size, which is being addressed in a larger ongoing study.

Funding was provided by a T32 grant, MD Anderson Cancer Center Support Grant, MD Anderson Ovarian Cancer Moon Shot, SPORE in Ovarian Cancer, the American Cancer Society, and the Ovarian Cancer Research Alliance. Dr. Glassman declared no relevant financial relationships.

FROM SGO 2022

Poverty-related stress linked to aggressive head and neck cancer

A humanized mouse model suggests that head and neck cancer growth may stem from chronic stress. In the study, socially isolated animals showed immunophenotypic changes and a greater propensity toward tumor growth and metastasis.

It is not uncommon for low-income patients with head and neck cancer to present with more aggressive disease at diagnosis. Other studies have shown this may be caused by a lack of access to health care services or poor-quality care, but the difference remains even after adjusting for these factors, according to researchers writing in Head and Neck.

Led by Heather A. Himburg, PhD, associate professor of radiation oncology with the Medical College of Wisconsin, Milwaukee, researchers conducted a study of head and neck cancer models in which tumor cells were implanted into a mouse with a humanized immune system.

Their theory was that psychosocial stress may contribute to the growth of head and neck tumors. The stress of poverty, social deprivation and social isolation can lead to the up-regulation of proinflammatory markers in circulating blood leukocytes, and this has been tied to worse outcomes in hematologic malignancies and breast cancer. Many such studies examined social adversity and found an association with greater tumor growth rates and treatment resistance.

Other researchers have used mouse models to study the phenomenon, but the results have been inconclusive. For example, some research linked the beta-adrenergic pathway to head and neck cancer, but clinical trials of beta-blockers showed no benefit, and even potential harm, for patients with head and neck cancers. Those results imply that this pathway does not drive tumor growth and metastasis in the presence of chronic stress.

Previous research used immunocompromised or nonhumanized mice. However, neither type of model reproduces the human tumor microenvironment, which may contribute to ensuing clinical failures. In the new study, researchers describe results from a preclinical model created using a human head and neck cancer xenograft in a mouse with a humanized immune system.

How the study was conducted

The animals were randomly assigned either to normal housing, with two or three littermates per cage, or to social isolation from littermates. There were five male and five female animals in each arm, and the animals were housed in their assigned conditions for 4 weeks before tumor implantation.

The isolated animals experienced increased growth and metastasis of the xenografts, compared with controls. The results are consistent with findings in immunodeficient or syngeneic mice, but the humanized nature of the new model could lead to better translation of findings into clinical studies. “The humanized model system in this study demonstrated the presence of both human myeloid and lymphoid lineages as well as expression of at least 40 human cytokines. These data indicate that our model is likely to well-represent the human condition and better predict human clinical responses as compared to both immunodeficient and syngeneic models,” the authors wrote.

The researchers also found that chronic stress may act through an immunoregulatory effect, since there was greater human immune infiltrate into the tumors of stressed animals. Increased presence of regulatory components such as myeloid-derived suppressor cells or regulatory T cells, or eroded function of tumor-infiltrating lymphocytes, might explain this finding. The researchers also identified a proinflammatory change in peripheral blood mononuclear cells in the stressed group. When they analyzed samples from patients with annual household incomes below $45,000, they found a similar pattern. “This suggests that chronic socioeconomic stress may induce a similar proinflammatory immune state as our chronic stress model system,” the authors wrote.

Tumors were also different between the two groups of mice. Tumors in stressed animals had a higher percentage of cancer stem cells, which is associated with more aggressive tumors and worse disease-free survival. The researchers suggested that the up-regulated levels of the chemokine SDF-1 seen in the stressed animals may be driving the higher proportion of stem cells through effects on the CXCR4 receptor, which is expressed by stem cells in various organs and may promote migration, proliferation, and cell survival.

The study was funded by an endowment from Advancing a Healthier Wisconsin and a grant from the National Center for Advancing Translational Sciences. The authors reported no conflicts of interest.

FROM HEAD & NECK

Live-donor liver transplants for patients with CRC liver mets

Encouraging improvements in survival have been reported by surgeons who used liver transplants from live donors as a treatment for patients with colorectal cancer (CRC) and unresectable liver metastases. These patients usually have a poor prognosis, and for many, palliative chemotherapy is the standard of care.

“For the first time, we have been able to demonstrate [outside of Norway] that liver transplantation for patients with unresectable liver metastases is feasible with good outcomes,” lead author Gonzalo Sapisochin, MD, PhD, an assistant professor of surgery at the University of Toronto, said in an interview.

“Furthermore, this is the first time we are able to prove that living donation may be a good strategy in this setting,” Dr. Sapisochin said of the series of 10 cases that they published in JAMA Surgery.

The series showed “excellent perioperative outcomes for both donors and recipients,” noted the authors of an accompanying commentary. They said the team “should be commended for adding live-donor liver transplantation to the armamentarium of surgical options for patients with CRC liver metastases.”

However, they expressed concern about the relatively short follow-up of 1.5 years and the “very high” recurrence rate of 30%.

Commenting in an interview, lead editorialist Shimul Shah, MD, an associate professor of surgery and the chief of solid organ transplantation at the University of Cincinnati, said: “I agree that overall survival is an important measure to look at, but it’s hard to look at overall survival with [1.5] years of follow-up.”

Other key areas of concern are the need for more standardized practices and for more data on how liver transplantation compares with patients who just continue to receive chemotherapy.

“I certainly think that there’s a role for liver transplantation in these patients, and I am a big fan of this,” Dr. Shah emphasized, noting that four patients at his own center have recently received liver transplants, including three from deceased donors.

“However, I just think that as a community, we need to be cautious and not get too excited too early,” he said. “We need to keep studying it and take it one step at a time.”

Moving from deceased to living donors

Nearly 70% of patients with CRC develop liver metastases, and when these are unresectable, the prognosis is poor, with 5-year survival rates of less than 10%.

The option of liver transplantation was first reported in 2015 by a group in Norway. Their study included 21 patients with CRC and unresectable liver tumors. They reported a striking improvement in overall survival at 5 years (56% vs. 9% among patients who started first-line chemotherapy).

But with shortages of donor livers, this approach has not caught on. Deceased-donor liver allografts are in short supply in most countries, and recent allocation changes have further shifted available organs away from patients with liver tumors.

An alternative is to use living donors. In a recent study, Dr. Sapisochin and colleagues showed viability and a survival advantage, compared with deceased-donor liver transplantation.

Building on that work, they established treatment protocols at three centers – the University of Rochester (N.Y.) Medical Center, the Cleveland Clinic, and the University Health Network in Toronto.

Of 91 prospectively enrolled patients evaluated with liver-confined, unresectable CRC liver metastases, 10 met all inclusion criteria and received living-donor liver transplants between December 2017 and May 2021. The median age of the patients was 45 years; six were men, and four were women.

These patients all had primary tumors greater than stage T2 (six T3 and four T4b). Lymphovascular invasion was present in two patients, and perineural invasion was present in one patient.

The median time from diagnosis of the liver metastases to liver transplant was 1.7 years (range, 1.1-7.8 years).

At a median follow-up of 1.5 years (range, 0.4-2.9 years), recurrences occurred in three patients, with a rate of recurrence-free survival, using Kaplan-Meier estimates, of 62% and a rate of overall survival of 100%.

Rates of morbidity associated with transplantation were no higher than those observed in established standards for the donors or recipients, the authors noted.

Among transplant recipients, three patients had no Clavien-Dindo complications; three had grade II, and four had grade III complications. Among donors, five had no complications, four had grade I, and one had grade III complications.

All 10 donors were discharged from the hospital 4-7 days after surgery and recovered fully.

All three patients who experienced recurrences were treated with palliative chemotherapy. One died of disease after 3 months of treatment. As of the time of publication of the study, the other two had survived for 2 or more years following their live-donor liver transplant.
 

Patient selection key

The authors are now investigating tumor subtypes, responses in CRC liver metastases, and other factors, with the aim of developing a novel screening method to identify appropriate candidates more quickly.

In the meantime, they emphasized that indicators of disease biology, such as the Oslo Score, the Clinical Risk Score, and sustained clinical response to systemic therapy, “remain the key filters through which to select patients who have sufficient opportunity for long-term cancer control, which is necessary to justify the risk to a living donor.”

Dr. Sapisochin reported receiving grants from Roche and Bayer and personal fees from Integra, Roche, AstraZeneca, and Novartis outside the submitted work. Dr. Shah disclosed no relevant financial relationships.

A version of this article first appeared on Medscape.com.



FROM JAMA SURGERY


Some leukemias detectable up to 16 years before diagnosis?


The preclinical phase of chronic lymphocytic leukemia (CLL) may last longer than previously thought, even in adverse-prognostic cases, as suggested by a sequencing analysis of blood samples obtained up to 22 years prior to CLL diagnosis.

Previous analyses showed that monoclonal B-cell lymphocytosis (MBL), a CLL precursor state, had been detected up to 6 years before CLL diagnosis, the investigators explained, noting that “[a]nother prognostically relevant immunogenetic feature of CLL concerns the stereotype of the B-cell receptor immunoglobulins (BcR IG).”

“Indeed, distinct stereotyped subsets can be defined by the expression of shared sequence motifs and are associated with particular presentation and outcomes,” P. Martijn Kolijn, PhD, a researcher in the department of immunology at Erasmus Medical Center, Rotterdam, the Netherlands, and colleagues wrote in a brief report published online in Blood. In an effort to “gain insight into the composition of the BcR IG repertoire during the early stages of CLL,” the investigators utilized next-generation sequencing to analyze 124 blood samples taken from healthy individuals up to 22 years before they received a diagnosis of CLL or small lymphocytic leukemia (SLL). An additional 118 matched control samples were also analyzed.

Study subjects were participants in the European Prospective Investigation into Cancer and Nutrition (EPIC) cohort.

“First, unsurprisingly, we observed a significant difference in the frequency of the dominant clonotype in CLL patients versus controls with a median frequency of 54.9%, compared to only 0.38% in controls,” they wrote.
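The dominant clonotype frequency quoted here is simply the share of sequencing reads assigned to the single most common IGH rearrangement. A sketch of that calculation (the clonotype labels below are made up for illustration):

```python
from collections import Counter

def dominant_clonotype_frequency(clonotype_calls):
    """Fraction of reads assigned to the most frequent clonotype."""
    counts = Counter(clonotype_calls)
    clonotype, n = counts.most_common(1)[0]
    return clonotype, n / len(clonotype_calls)

# Hypothetical clonotype call per sequencing read (not real repertoire data):
reads = ["IGHV4-34_A"] * 6 + ["IGHV3-21_B"] * 3 + ["IGHV1-69_C"] * 1
top, freq = dominant_clonotype_frequency(reads)
# Here the dominant clonotype accounts for 6 of 10 reads, i.e. 60%.
```

A CLL-like sample is massively skewed toward one clonotype (the reported median of 54.9%), whereas a healthy polyclonal repertoire spreads reads across thousands of clonotypes, so the top one stays near zero (0.38% in controls).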

Among 28 patients whose lymphocyte counts were measured at baseline, 10 showed evidence of lymphocytosis up to 8 years before CLL diagnosis.

This suggests undiagnosed instances of high-count MBL (cases with a cell count above 0.5 × 10⁹ cells/L, which can progress to CLL) or asymptomatic CLL, they explained.

“In contrast, next-generation sequencing results showed detectable skewing of the IGH gene repertoire in 21/28 patients up to 15 years before CLL diagnosis, often in the absence of elevated lymphocyte counts,” they wrote. “Remarkably, some patients with CLL requiring treatment and clinical transformation to an aggressive B-cell lymphoma displayed considerable skewing in the IGH gene repertoire even 16 years before CLL diagnosis.”

Patients with a prediagnostic IGHV-unmutated dominant clonotype had significantly shorter overall survival after CLL diagnosis than did those with an IGHV-mutated clonotype, they noted.

“Furthermore, at early timepoints (>10 years before diagnosis), patients with a high dominant clonotype frequency were more likely to be IGHV mutated, whereas closer to diagnosis this tendency was lost, indicating that the prediagnostic phase may be even longer than 16 years for [mutated] CLL patients,” they added.

The investigators also found that:

  • Twenty-five patients carried stereotyped BcR IG up to 17 years prior to CLL diagnosis, and of these, 10 clonotypes were assigned to minor subsets and 15 to major CLL subsets. Among the latter, 14 of the 15 belonged to high-risk subsets, and most of those showed a trend for faster disease evolution.
  • High frequency of the dominant clonotype was evident in samples obtained less than 6 years before diagnosis, whereas high-risk stereotyped clonotypes found longer before diagnosis (as early as 16 years) tended to have a lower dominant clonotype frequency (<20% of the IGH gene repertoire).
  • The stereotyped BcR IG matched the clonotype at diagnosis for both patients for whom diagnostic material was available.
  • No stereotyped subsets were identified among the dominant clonotypes of the healthy controls.
 

 

“To our knowledge, the dynamics of the emergence of biclonality in an MBL patient and subsequent progression to CLL have never been captured in such a convincing manner,” they noted.

The findings “extend current knowledge on the evolution of the IGH repertoire prior to CLL diagnosis, highlighting that even high-risk CLL subtypes may display a prolonged indolent preclinical stage,” they added, speculating that “somatic genetic aberrations, (auto)stimulation, epigenetic and/or microenvironmental influences are required for the transformation into overt CLL.”

The investigators also noted that since the observed skewing in the IGH gene repertoire often occurs prior to B-cell lymphocytosis, they consider the findings “a novel extension to the characterization of MBL.”

“Further studies may prove invaluable in the clinical distinction between ‘progressing’ MBL versus ‘stable’ MBL. Notwithstanding the above, we emphasize that early detection is only warranted if it provides clear benefits to patient care,” they concluded.

In a related commentary, Gerald Marti, MD, PhD, of the National Heart, Lung, and Blood Institute, emphasized that the findings “represent the earliest detection of a clonotypic precursor cell for CLL.”

They also raise new questions and point to new directions for research, Dr. Marti noted.

“Where do we go from here? CLL has a long evolutionary history in which early branching may start as an oligoclonal process (antigen stimulation) and include driver mutations,” he wrote. “A long-term analysis of the B-cell repertoire in familial CLL might shed light on this process. Further clarification of the mechanisms of age-related immune senescence is also of interest.”

The study authors and Dr. Marti reported having no competing financial interests.




FROM BLOOD


Pembro provides DFS benefit in early NSCLC


Adjuvant pembrolizumab significantly improves disease-free survival (DFS) compared to placebo in patients with early-stage non–small cell lung cancer (NSCLC) who have undergone complete resection, according to findings from the phase 3 PEARLS/KEYNOTE-091 (PEARLS) study.

Patients in the pembrolizumab arm demonstrated median DFS nearly 12 months longer than those in the placebo arm (53.6 vs. 42.0 months). Investigators observed a DFS benefit for patients with any programmed death-ligand 1 (PD-L1) expression.

“We believe that pembrolizumab has the potential to become a new adjuvant treatment option for patients with [stage IB to IIIA] non–small cell lung cancer following complete resection and adjuvant chemotherapy when recommended,” concluded first author Luis Paz-Ares, MD, chair of the clinical research unit at Hospital Universitario 12 de Octubre, CNIO & Universidad Complutense, Madrid. “Pembrolizumab provided a benefit regardless of pathological stage and PD-L1 progression subgroup.”

The findings were presented by Dr. Paz-Ares at the European Society for Medical Oncology (ESMO) March virtual plenary session and published March 17 in Annals of Oncology.

Pembrolizumab is the standard treatment for patients with advanced NSCLC, but its efficacy in early-stage disease remains unclear. To determine whether patients with early-stage disease benefit from pembrolizumab, Dr. Paz-Ares and colleagues randomized 1,177 adults with stage IB, II, or IIIA NSCLC to 200 mg of pembrolizumab (n = 590) or placebo (n = 587) every 3 weeks.

All patients had an Eastern Cooperative Oncology Group performance status of 0 or 1 and any level of PD-L1 expression. Of the study participants, 168 in the pembrolizumab arm and 165 in the placebo arm had PD-L1 expression and a tumor proportion score (TPS) of at least 50%.

Overall, patients receiving pembrolizumab had a DFS of 53.6 months compared to 42.0 months in the placebo arm (hazard ratio [HR], 0.76; P = .0014). The DFS benefit was generally consistent across patients with PD-L1 TPS <1%, 1%-49%, and ≥50%. In the subset of patients with PD-L1 TPS ≥50%, a slightly higher percentage of patients in the pembrolizumab group demonstrated DFS at 18 months (71.7% vs. 70.2%), but the difference did not reach statistical significance (HR, 0.82; P = .14).
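As a rough consistency check, under a constant-hazard (exponential) assumption the median survival time is inversely proportional to the hazard, so the ratio of the two reported medians approximates the hazard ratio. This is an illustrative back-of-the-envelope calculation, not the trial's statistical model:

```python
# Exponential model: median = ln(2) / hazard, so
# HR (pembrolizumab vs. placebo) ≈ placebo median / pembrolizumab median.
median_pembro = 53.6   # months, pembrolizumab arm (reported)
median_placebo = 42.0  # months, placebo arm (reported)

implied_hr = median_placebo / median_pembro
print(round(implied_hr, 2))  # prints 0.78, near the reported HR of 0.76
```

The small gap between 0.78 and the reported 0.76 is expected, since real survival curves are not exactly exponential and the trial's HR comes from a regression over the full follow-up.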



Overall survival (OS) at 18 months was 91.7% in the treatment arm and 91.3% in the placebo arm (HR, 0.87; P = .17), but the data were immature.

“The disease-free survival benefit was observed across most prespecified subgroups,” Dr. Paz-Ares said.

No new safety concerns were raised. Grade 3 or greater adverse events occurred in 34.1% of patients in the treatment arm and 25.8% in the placebo arm. Adverse events led to discontinuation in 19.8% of patients receiving pembrolizumab and 5.9% of patients in the placebo group.

Invited discussant Martin Reck, MD, said these findings represent forward progress. “We do see many patients with distant relapse, which indicates that we have to improve our control of the systemic relapse,” said Dr. Reck, head of the department of thoracic oncology and the clinical trial department at the Lungen Clinic Grosshansdorf, Germany.

Prior data provide a rationale for using immune checkpoint inhibition in early-stage NSCLC, and both the PEARLS study and the IMpower010 trial evaluating atezolizumab in a similar setting have demonstrated relevant improvements in DFS.

“I think we are entering the times of perioperative immunotherapies. We are seeing the first signals of efficacy for adjuvant immunotherapy in two large, randomized trials,” Dr. Reck said.

Based on the PEARLS trial results, Dr. Reck said that PD-L1 appears to have predictive and prognostic value but noted that “several other clinical trials say PD-L1 expression is a poor prognostic marker” for sensitivity to immune checkpoint inhibitor. Given this potential inconsistency, Dr. Reck called for further follow-up in this patient population and for studies in larger groups of patients to further delineate the role of PD-L1 as well as EGFR mutations and adjuvant chemotherapy in patients with early NSCLC.

The PEARLS study was funded by Merck Sharp & Dohme Corp. Dr. Paz-Ares and Dr. Reck disclosed numerous relationships with pharmaceutical companies.

Publications
Topics
Sections

Adjuvant pembrolizumab significantly improves disease-free survival (DFS) compared with placebo in patients with early-stage non–small cell lung cancer (NSCLC) who have undergone complete resection, according to findings from the phase 3 PEARLS/KEYNOTE-091 (PEARLS) study.

Patients in the pembrolizumab arm demonstrated a median DFS nearly 12 months longer than those in the placebo arm (53.6 vs. 42.0 months). Investigators observed a DFS benefit regardless of programmed death-ligand 1 (PD-L1) expression level.

“We believe that pembrolizumab has the potential to become a new adjuvant treatment option for patients with [stage IB to IIIA] non–small cell lung cancer following complete resection and adjuvant chemotherapy when recommended,” concluded first author Luis Paz-Ares, MD, chair of the clinical research unit at Hospital Universitario 12 de Octubre, CNIO & Universidad Complutense, Madrid. “Pembrolizumab provided a benefit regardless of pathological stage and PD-L1 expression subgroup.”

The findings were presented by Dr. Paz-Ares at the European Society for Medical Oncology (ESMO) March virtual plenary session and published March 17 in Annals of Oncology.

Pembrolizumab is a standard treatment for patients with advanced NSCLC, but its efficacy in early-stage disease remains unclear. To determine whether patients with early-stage disease benefit from pembrolizumab, Dr. Paz-Ares and colleagues randomized 1,177 adults with resected stage IB, II, or IIIA NSCLC to 200 mg of pembrolizumab (n = 590) or placebo (n = 587) every 3 weeks.

All patients had an Eastern Cooperative Oncology Group performance status of 0 or 1 and any level of PD-L1 expression. Of the study participants, 168 in the pembrolizumab arm and 165 in the placebo arm had a PD-L1 tumor proportion score (TPS) of at least 50%.

Overall, patients receiving pembrolizumab had a median DFS of 53.6 months, compared with 42.0 months in the placebo arm (hazard ratio [HR], 0.76; P = .0014). The DFS benefit was generally consistent across patients with PD-L1 TPS <1%, 1%-49%, and ≥50%. In the subset of patients with PD-L1 TPS ≥50%, a slightly higher percentage of the pembrolizumab group remained disease free at 18 months (71.7% vs. 70.2%), but the difference did not reach statistical significance (HR, 0.82; P = .14).

Overall survival (OS) at 18 months was 91.7% in the treatment arm and 91.3% in the placebo arm (HR, 0.87; P = .17), but the data were immature.

“The disease-free survival benefit was observed across most prespecified subgroups,” Dr. Paz-Ares said.

No new safety concerns were raised. Grade 3 or greater adverse events occurred in 34.1% of patients in the treatment arm and 25.8% in the placebo arm. Adverse events led to discontinuation in 19.8% of patients receiving pembrolizumab and 5.9% of patients in the placebo group.

Invited discussant Martin Reck, MD, said these findings represent forward progress. “We do see many patients with distant relapse, which indicates that we have to improve our control of the systemic relapse,” said Dr. Reck, head of the department of thoracic oncology and the clinical trial department at the Lungen Clinic Grosshansdorf, Germany.

Prior data provide a rationale for using immune checkpoint inhibition in early-stage NSCLC, and both the PEARLS study and the IMpower010 trial evaluating atezolizumab in a similar setting have demonstrated relevant improvements in DFS.

“I think we are entering the times of perioperative immunotherapies. We are seeing the first signals of efficacy for adjuvant immunotherapy in two large, randomized trials,” Dr. Reck said.

Based on the PEARLS trial results, Dr. Reck said that PD-L1 appears to have predictive and prognostic value but noted that “several other clinical trials say PD-L1 expression is a poor prognostic marker” for sensitivity to immune checkpoint inhibitors. Given this potential inconsistency, Dr. Reck called for longer follow-up in this patient population and for studies in larger groups of patients to further delineate the roles of PD-L1 expression, EGFR mutations, and adjuvant chemotherapy in patients with early-stage NSCLC.

The PEARLS study was funded by Merck Sharp & Dohme Corp. Dr. Paz-Ares and Dr. Reck disclosed numerous relationships with pharmaceutical companies.


Article Source

FROM THE ESMO MARCH PLENARY
