One in four NSCLC patients respond poorly to COVID-19 vaccine
One in four patients with non–small cell lung cancer (NSCLC) respond poorly to COVID-19 vaccination, according to a new study.
The study was published in the Journal of Clinical Oncology.
“Booster vaccination increased binding and neutralizing antibody titers to Omicron, but antibody titers declined after 3 months. These data highlight the concern for patients with cancer given the rapid spread of SARS-CoV-2 Omicron variant,” wrote the authors, who were led by Rafi Ahmed, PhD, Emory University, Atlanta.
Researchers found that 18% of patients had no detectable antibodies at all, and that the type of active treatment had no association with vaccine response.
Researchers examined antibody titers among 82 NSCLC patients and 53 healthy volunteers, collecting blood samples longitudinally for analysis. While most patients had binding and neutralizing antibody titers comparable with those of healthy volunteers, 25% had poor responses, with titers six- to sevenfold lower than those of healthy controls. There was no association between worse vaccine responses and a history of programmed death–1 monotherapy, chemotherapy, or the two in combination. Receipt of a booster vaccine improved binding and neutralizing antibody titers to both the wild type and the Omicron variant, but 2-4 months after the booster, neutralizing titers to both had decreased five- to sevenfold.
“This study indicates both the need to monitor our patients with lung cancer for response to COVID-19 mRNA vaccines, identify the nonresponders for follow-up and further attempts at immunization, and continue collecting and analyzing clinicodemographic information and biospecimens from our patients,” wrote the authors of an accompanying editorial.
Although the findings reveal potential concerns, the good news is that most NSCLC patients do respond normally to COVID-19 vaccination, said John D. Minna, MD, University of Texas Southwestern Medical Center, Dallas, lead author of the editorial.
He offered some advice to physicians. “You can test your patients using currently available [Clinical Laboratory Improvement Amendments]–approved lab tests to determine what their antibody titers are. This should be done after boosting since titers will go down after time. We know that if a patient has lung cancer and they do get infected with SARS-CoV-2 that potentially they could develop serious COVID-19 disease. Besides giving antiviral treatment, it is important that they be closely monitored for symptoms of progression so if they need to be hospitalized it can be done in a prudent manner,” said Dr. Minna, who is director of the Hamon Center for Therapeutic Oncology Research at the University of Texas Southwestern Medical Center.
No clinical details have emerged that might predict which patients have an insufficient response to vaccination. “When we started these studies, a lot of us thought that anyone who did not develop a good antibody response would be weak or sicker. For example, [patients with] late-stage disease, or having had a lot of therapy, or perhaps immune checkpoint blockade. However, none of these things are correlated. The main advice we are giving our lung cancer patients are to get vaccinated, get boosted (double boosted), and just do the smart thing to protect yourself from exposure,” he said.
For example, when traveling on a plane, patients should wear a mask. They should also avoid large indoor events. He also recommended that, following vaccination and boosters, patients seek out CLIA-certified tests to get their titer checked.
“Upon any COVID infection, even if their titer is at or above 80%, patients should see their physician to consider treatment with Paxlovid (nirmatrelvir/ritonavir), which has emergency use authorization. For patients with a lower titer, it’s important to seek out a physician and consider Paxlovid and possibly antibody therapy. But these are individual decisions to be made with your doctor,” Dr. Minna said.
The next important research question is what happens to T-cell immune response following vaccination. “We know that a good cellular immune response is also important in preventing infection and severe infection, but we don’t yet know which persons (with or without cancer) have good T-cell responses. This information will also likely impact what we tell our patients and will add to the antibody data,” he said.
Studies are ongoing to determine specific T-cell responses to mRNA vaccines, and how well those T-cell responses respond to SARS-CoV-2 infection in the laboratory.
Treatment combo shows ‘clinical benefit’ in liver cancer trial
The combination of cabozantinib and atezolizumab shows clinical benefit in patients with hepatocellular carcinoma, a new study finds.
While the combination has shown benefit in renal cell carcinoma and other solid tumor types, this is the first phase 3 clinical trial to test it in hepatocellular carcinoma.
The new study, published in The Lancet Oncology, included 837 patients from 178 hospitals in 32 countries who were enrolled in the trial (called COSMIC-312) between December 2018 and August 2020. Of these, 432 patients were randomly assigned to receive a combination of cabozantinib (Cabometyx, Exelixis), a tyrosine kinase inhibitor (TKI), and atezolizumab (Tecentriq, Genentech), a PD-L1 inhibitor; 217 patients were treated with sorafenib (Nexavar, Bayer) alone; and 188 patients were treated with cabozantinib alone.
Patients who received the cabozantinib and atezolizumab combination showed clinically meaningful improvements in progression-free survival, greater disease control, and a lower rate of primary progression than patients treated with sorafenib. However, there was no improvement in overall survival.
“The improvement in progression-free survival with cabozantinib plus atezolizumab in this study shows that the combination confers clinical benefit for patients with advanced hepatocellular carcinoma previously untreated with systemic anticancer therapy,” wrote the authors of the study, led by Robin Kate Kelley, MD, a gastrointestinal oncologist with the University of California, San Francisco, and Lorenza Rimassa, MD, a gastrointestinal oncologist with Humanitas University, Milan. “The absence of a benefit in overall survival, along with the availability of atezolizumab in combination with bevacizumab, indicates the need for additional studies to determine if cabozantinib plus atezolizumab would be an appropriate first-line treatment option in select patient populations.”
For symptomatic patients with high disease burden or main portal vein occlusion who are at risk for impending complications, controlling the disease as quickly as possible is vital, the authors wrote. “Underlying chronic liver disease is nearly universal in patients with hepatocellular carcinoma and the risk of gastrointestinal bleeding is high in this population, particularly if portal vein tumor thrombus is present.”
Hepatocellular carcinoma (HCC) is an angiogenic tumor, making it a logical target for TKIs that target vascular endothelial growth factor. The TKI sorafenib was the first to be approved as a first-line treatment for HCC, and since then immune checkpoint inhibitors have been shown to induce durable responses in the first-line setting, but have not improved overall survival in randomized trials.
Study methodology
In the study, after a median follow-up of 15.8 months, median progression-free survival was 6.8 months in the combination group and 4.2 months in the sorafenib group (hazard ratio, 0.63; P = .0012). Median overall survival was 15.4 months in the combination group and 15.5 months in the sorafenib group (not significant). Grade 3-4 adverse events included increased ALT, which occurred in 9% of the combination group, 3% of the sorafenib group, and 6% of the cabozantinib-only group; hypertension (9%, 8%, and 12%, respectively); increased AST (9%, 4%, and 10%); and palmar-plantar erythrodysesthesia (8%, 8%, and 9%). Serious treatment-related adverse events occurred in 18% of patients in the combination arm, 8% in the sorafenib arm, and 13% in the cabozantinib arm.
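For intuition on how the reported hazard ratio squares with the median figures, here is a minimal back-of-envelope check in Python. It assumes exponential (constant-hazard) progression-free survival, which is an assumption for illustration and not the trial's actual statistical model; under that assumption, the hazard ratio is roughly the inverse ratio of the medians.

```python
# Rough consistency check (assumes exponential survival in both arms,
# which the trial does not claim): HR ~ inverse ratio of median PFS.
median_pfs_combo = 6.8      # months, cabozantinib + atezolizumab
median_pfs_sorafenib = 4.2  # months, sorafenib alone

approx_hr = median_pfs_sorafenib / median_pfs_combo
print(round(approx_hr, 2))  # ~0.62, in line with the reported HR of 0.63
```

The agreement is only approximate, of course; the trial's HR comes from a proportional-hazards analysis of the full survival curves, not the medians alone.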
There were no excess serious bleeding events in the treatment groups containing cabozantinib, compared with sorafenib, which is noteworthy because patients with HCC are at high risk for gastrointestinal bleeding.
Treatment-related grade 5 events were rare, occurring in 1% (six patients) of the combination group, and in just one patient in each of the sorafenib and cabozantinib groups.
Although the results suggest promising clinical benefit, the lack of an overall survival benefit limits the implications of these findings. Since atezolizumab combined with bevacizumab is also available for this patient population, more research is needed to determine whether cabozantinib plus atezolizumab can become a first-line option.
The study had some limitations: Participants had to have Child-Pugh class A disease, though there was no requirement to assess for fibrosis or cirrhosis. Otherwise, there were few barriers to study entry.
The study was sponsored by Exelixis (Alameda) and Ipsen (Boulogne-Billancourt, France).
Taste dysfunction in head and neck cancer due to radiation dose
Taste dysfunction can affect up to 90% of patients undergoing radiotherapy for head and neck cancer, finds a new study from JAMA Otolaryngology–Head & Neck Surgery. While the ability to taste usually returns after treatment concludes, some patients feel the lingering effects of radiotherapy on taste function long afterward. Taste dysfunction can lead to weight loss and dry mouth, which can, in turn, negatively affect quality of life.
“Taste dysfunction has profound effects on quality of life in patients with head and neck cancer, and the oral cavity dose could be significantly lower with modern radiotherapy techniques,” wrote the researchers, who were led by Miao-Fen Chen, MD, PhD, of Chang Gung University, Taoyuan City, Taiwan. “This study provides useful dose constraints of the oral cavity that may be associated with reduced taste dysfunction.”
Degradation of taste is an important quality-of-life factor for head and neck cancer patients. A 2021 systematic review published in the journal Radiotherapy and Oncology found that acute taste dysfunction affected 96% of patients as measured objectively and 79% as measured subjectively. While most patients recover, an estimated 23%-53% experience long-term dysfunction.
In 2019, a study published in the journal Chemical Senses found that 31% of head and neck cancer patients had long-term changes to taste at 27 months after intensity-modulated radiotherapy (IMRT), with dysfunction associated with glossectomy and oral cavity radiation doses greater than 50 Gy. That study, however, used only a single subjective quality-of-life measure to evaluate taste function.
In the new JAMA study, researchers reported the results of a longitudinal study using the whole-mouth solution method for the basic tastes: salt, sweet, sour, and bitter.
Study methodology
The study included 87 patients (mean age, 58 years; 90% men) who were enrolled between 2017 and 2020 at a single hospital. Of these, 45 patients received primary radiotherapy and 42 received postoperative radiotherapy; 78 patients received volumetric arc therapy and 9 received intensity-modulated radiotherapy. The radiotherapy was planned to minimize the dose to the parotid glands and oral cavity.
Researchers measured taste dysfunction according to detection thresholds based on solutions of different concentrations. After swishing a solution around the mouth and spitting it out, patients were asked to identify the taste components. Following a water rinse, they tested a solution with another concentration of the taste components. A score was assigned based on the concentration level the patient was able to detect, with higher numbers indicating greater sensitivity.
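The threshold-scoring logic can be sketched in code. This is a hypothetical illustration only: the actual number of concentration levels and the exact scoring scale are not described in the article, so the `threshold_score` function and its five-level dilution series are assumptions.

```python
def threshold_score(correct_at: list[bool]) -> int:
    """Score a detection-threshold test (hypothetical scoring scheme).

    correct_at[i] is True if the patient correctly identified the taste at
    dilution level i, ordered from most dilute (hardest to detect) to most
    concentrated (easiest). Detecting a more dilute solution earns a higher
    score; failing every level scores 0.
    """
    n = len(correct_at)
    for i, correct in enumerate(correct_at):
        if correct:
            return n - i
    return 0

# Example: patient first detects the taste at the 3rd of 5 dilution levels.
print(threshold_score([False, False, True, True, True]))  # -> 3
```

Under this scheme, the mean scores reported below (e.g., a drop from 4.7 to 1.4 for salt) would correspond to patients needing much more concentrated solutions to detect a taste after radiotherapy.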
Two to four weeks after initiation of radiotherapy, there were drops in mean taste scores for salt (4.7 to 1.4), sweet (4.2 to 1.8), sour (4.5 to 2.3), and bitter (4.7 to 1.2). One week after radiotherapy, those mean scores rose to 2.6, 2.6, 2.9, and 2.3, respectively. Over the following 3 months, mean scores showed general recovery to near preradiotherapy levels (4.2, 3.9, 4.1, and 4.0, respectively). At 6 months and 1 year, the scores were equivalent to preradiotherapy levels.
Objective taste tests were performed on 81 participants: 33.3% had taste dysfunction 3 months after radiotherapy, and 8.9% had taste dysfunction at 6 months. At 3 months following radiotherapy, taste dysfunction was associated with an oral cavity mean dose of 4,000 cGy or higher (relative risk, 2.87; 95% confidence interval, 1.21-6.81) or 5,000 cGy or higher (RR, 2.04; 95% CI, 1.12-3.72). At 6 months, taste dysfunction was predicted by glossectomy (RR, 5.63; 95% CI, 1.12-28.15) and an oral cavity mean dose of 5,000 cGy or greater (RR, 7.79; 95% CI, 0.93-64.92).
The researchers also quantified the relationship between mean oral cavity dose and the probability of developing taste dysfunction at 3 and 6 months. Three months after radiotherapy, a dose of 25 Gy predicted a 15% chance of taste dysfunction, 38 Gy predicted a 25% chance, and 60 Gy predicted a 50% chance. At 6 months, the corresponding doses were 57, 60, and 64 Gy.
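The three dose-probability points reported at 3 months happen to be consistent with a simple logistic dose-response curve. The sketch below is an illustration, not the study's model: the logistic form and its parameterization are assumptions, with parameters solved from the 60-Gy (50%) and 38-Gy (25%) points and then checked against the 25-Gy point.

```python
import math

# Reported 3-month dose-response points: 25 Gy -> 15%, 38 Gy -> 25%,
# 60 Gy -> 50% probability of taste dysfunction.
# Assume a two-parameter logistic curve p(d) = 1 / (1 + exp(-(d - d50) / s));
# this functional form is an assumption, not the study's stated model.

d50 = 60.0  # dose with 50% dysfunction probability, per the reported data

# Solve the slope s from the 25% point:
#   0.25 = 1 / (1 + exp((d50 - 38) / s))  =>  exp((d50 - 38) / s) = 3
s = (d50 - 38.0) / math.log(3.0)

def p_dysfunction(dose_gy: float) -> float:
    """Assumed logistic probability of taste dysfunction at 3 months."""
    return 1.0 / (1.0 + math.exp(-(dose_gy - d50) / s))

# Check the remaining reported point: 25 Gy should come out near 15%.
print(round(p_dysfunction(25.0), 2))  # ~0.15, matching the reported value
```

That the third point falls out of a curve fit to the other two suggests the authors' 3-month dose constraints follow a smooth sigmoidal trend, though the study itself only reports the discrete values.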
The study was limited by its single-center design and small sample size, and the patients recruited varied significantly in treatment modality and disease subtype.
JAMA Otolaryngology–Head & Neck Surgery.
finds a new study fromTaste dysfunction can affect up to 90% of patients undergoing radiotherapy for head and neck cancer. While the ability to taste usually returns after the treatment concludes, some patients can still feel the lingering effects of radiotherapy on taste function long after the treatment concludes. It can lead to weight loss and dry mouth which can, in turn, negatively affect quality of life.
“Taste dysfunction has profound effects on quality of life in patients with head and neck cancer, and the oral cavity dose could be significantly lower with modern radiotherapy techniques,” wrote the researchers, who were led by Miao-Fen Chen, MD, PhD, of Chang Gung University, Taoyuan City, Taiwan. “This study provides useful dose constraints of the oral cavity that may be associated with reduced taste dysfunction.”
Degradation of taste is an important quality-of-life factor for head and neck cancer patients. A 2021 systematic review published in the journal Radiotherapy and Oncology found that acute taste dysfunction affected 96% of patients as measured objectively and 79% as measured subjectively. While most patients recover, an estimated 23%-53% of patients experience long-term dysfunction.
In 2019, a study published in the journal Chemical Senses found that 31% of head and neck cancer patients had long-term changes to taste at 27 months after intensity-modulated radiotherapy (IMRT), with dysfunction associated with glossectomy and oral cavity radiation doses greater than 50 Gy. That study, however, used only a single subjective quality-of-life measure to evaluate taste function.
In the new JAMA study, researchers reported the results of a longitudinal study using the whole-mouth solution method for basic tastes, including salt, sweet, sour, and bitter.
Study methodology
The study included 87 patients (mean age, 58 years; 90% men) who were enrolled between 2017 and 2020 at a single hospital. Of these, 45 patients received primary radiotherapy and 42 received postoperative radiotherapy; 78 were treated with volumetric-modulated arc therapy and 9 with intensity-modulated radiotherapy. Treatment was planned to minimize dose to the parotid glands and oral cavity.
Researchers measured taste dysfunction according to detection thresholds based on solutions with different concentrations. After moving the solution around the mouth and spitting it out, patients were asked to identify taste components. Following a water rinse, they tested a solution with another concentration of taste components. A number was assigned based on the concentration level they were able to detect, with higher numbers indicating greater sensitivity.
Two to four weeks after initiation of radiotherapy, mean taste scores dropped for salt (4.7 to 1.4), sweet (4.2 to 1.8), sour (4.5 to 2.3), and bitter (4.7 to 1.2). One week after radiotherapy, those mean scores had risen to 2.6, 2.6, 2.9, and 2.3, respectively. Over the following 3 months, mean scores reflected general recovery to near preradiotherapy levels (4.2, 3.9, 4.1, and 4.0, respectively). At 6 months and 1 year, the scores were equivalent to preradiotherapy levels.
Objective taste tests were performed on 81 participants: 33.3% had taste dysfunction 3 months after radiotherapy, and 8.9% had taste dysfunction at 6 months. At 3 months following radiotherapy, taste dysfunction was associated with an oral cavity mean dose of 4,000 cGy or higher (relative risk, 2.87; 95% confidence interval, 1.21-6.81) or 5,000 cGy or higher (RR, 2.04; 95% CI, 1.12-3.72). At 6 months, taste dysfunction was predicted by glossectomy (RR, 5.63; 95% CI, 1.12-28.15) and an oral cavity mean dose of 5,000 cGy or greater (RR, 7.79; 95% CI, 0.93-64.92).
The researchers also quantified the relationship between mean oral cavity dose and the probability of developing taste dysfunction at 3 and 6 months. Three months after radiotherapy, a mean dose of 25 Gy predicted a 15% chance of taste dysfunction, 38 Gy predicted a 25% chance, and 60 Gy predicted a 50% chance. At 6 months, the corresponding doses were 57, 60, and 64 Gy.
The study was limited by its single-center design and small sample size, and the recruited patients varied significantly in treatment modality and disease subtype.
FROM JAMA OTOLARYNGOLOGY–HEAD AND NECK SURGERY
Novel guidance informs plasma biomarker use for Alzheimer’s disease
SAN DIEGO – The Alzheimer’s Association has issued new recommendations on the use of blood-based biomarkers for Alzheimer’s disease. The organization has previously published recommendations for use of amyloid positron emission tomography (PET) and cerebrospinal fluid (CSF) biomarkers for Alzheimer’s disease.
The recommendations were the subject of a presentation at the 2022 Alzheimer’s Association International Conference and were published online in Alzheimer’s & Dementia.
During his presentation, Oskar Hansson, MD, PhD, stressed that the document describes recommendations, not criteria, for use of blood-based biomarkers. He suggested that the recommendations will need to be updated within 9-12 months, and that criteria for blood-based biomarker use could come within 2 years.
The new recommendations reflect the recent acceleration of progress in the field, according to Wiesje M. van der Flier, PhD, who moderated the session. “It’s just growing so quickly. I think within 5 years the whole field will have transformed. By starting to use them in specialized memory clinics first, but then also local memory clinics, and then finally, I think that they may also transform primary care,” said Dr. van der Flier, who is a professor of neurology at Amsterdam University Medical Center.
Guidance for clinical trials and memory clinics
The guidelines were created in part because blood-based biomarkers for Alzheimer’s disease have become increasingly available, and there has been a call from the community for guidance, according to Dr. Hansson. There is also a hazard that widespread adoption could interfere with the field itself, especially if physicians don’t understand how to interpret the results. That’s a particularly acute problem since Alzheimer’s disease pathology can precede symptoms. “It’s important to have some guidance about regulating their use so we don’t get the problem that they are misused and get a bad reputation,” said Dr. Hansson in an interview.
The current recommendations are for use in clinical trials to identify patients likely to have Alzheimer’s disease, as well as in memory clinics, though “we’re still a bit cautious. We still need to confirm it with other biomarkers. The reason for that is we still don’t know how these will perform in the clinical reality. So it’s a bit trying it out. You can start using these blood biomarkers to some degree,” said Dr. Hansson.
However, he offered the caveat that plasma-based biomarkers should only be used while confirming that the blood-based biomarkers agree with CSF tests, ideally more than 90% of the time. “If suddenly only 60% of the plasma biomarkers agree with CSF, you have a problem and you need to stop,” said Dr. Hansson.
The authors recommend that blood-based biomarkers be used in clinical trials to help select patients and identify healthy controls. Dr. Hansson said that there is not enough evidence that blood-based biomarkers have sufficient positive predictive value to be used as the sole criteria for clinical trial admission. However, they could also be used to inform decision-making in adaptive clinical trials.
Specifically, plasma Abeta42/Abeta40 and P-tau assays using established thresholds can be used as a first screening step for clinical trials, though abnormal blood biomarker results should be confirmed by PET or CSF. The biomarkers could also be used in non–Alzheimer’s disease clinical trials to exclude patients with probable Alzheimer’s disease copathology.
In memory clinics, the authors recommend that blood-based biomarkers be used only in patients who are symptomatic and, when possible, that results be confirmed by PET or CSF.
More work to be done
Dr. Hansson noted that 50%-70% of patients with Alzheimer’s disease are misdiagnosed in primary care, showing a clear need for biomarkers that could improve diagnosis. However, he stressed that blood-based biomarkers are not yet ready for use in that setting.
Still, they could eventually become a boon. “The majority of patients now do not get any biomarker support to diagnosis. They do not have access to amyloid PET or [CSF] biomarkers, but when the blood-based biomarkers are good enough, that means that biomarker support for an Alzheimer’s diagnosis [will be] available to many patients … across the globe,” said Dr. van der Flier.
There are numerous research efforts underway to validate blood-based biomarkers in more diverse groups of patients. That’s because the retrospective studies typically used to identify and validate biomarkers tend to recruit carefully selected patients, with clearly defined cases and good CSF characterization, according to Charlotte Teunissen, PhD, who is also a coauthor of the guidelines and professor of neuropsychiatry at Amsterdam University Medical Center. “Now we want to go one step further to go real-life practice, and there are several initiatives,” she said.
Dr. Hansson, Dr. Teunissen, and Dr. van der Flier have no relevant financial disclosures.
FROM AAIC 2022
Higher ADR continues to show ‘strong, consistent’ link with lower interval CRC
Higher adenoma detection rates (ADRs) during colonoscopy were associated with lower rates of interval colorectal cancer (CRC), and the relationship held true along a broad range of ADR values, according to a retrospective study.
The new study, published online in JAMA, examined ADRs and rates of interval colorectal cancer among patients in California and Washington State between 2011 and 2017. The authors found a 3% reduction in risk for each 1% increase in ADR, and the reduction in risk held true even at high ADRs.
“It basically reaffirms what we’ve believed for the longest time, and other research work has documented – that interim cancers are higher in association with lower adenoma detection rates. The higher you can get that adenoma detection rate, the more we’re going to be able to lower the [rate of] cancers that develop within 3 years of a colonoscopy,” said Lawrence Kosinski, MD, who was asked to comment on the study.
The study included 735,396 patients with a median age of 61.4 years. Among these patients, 852,624 negative colonoscopies were performed by 383 eligible physicians. Participating physicians had to perform at least 25 screening colonoscopies and 100 total colonoscopies per year. After 2.4 million person-years of follow-up, the researchers observed 619 postcolonoscopy colorectal cancers and 36 related deaths over a median follow-up of 3.25 years.
There was an association between each 1% increase in ADR and a reduced probability of postcolonoscopy CRC (hazard ratio [HR], 0.97; 95% confidence interval [CI], 0.96-0.98) and mortality from postcolonoscopy CRC (HR, 0.95; 95% CI, 0.92-0.99).
The median ADR was 28.3%. There was an association between ADR above the median versus below the median and a reduced risk of postcolonoscopy CRC with 1.79 cases versus 3.10 cases per 10,000 person-years, respectively (absolute difference in 7-year risk, –12.2 per 10,000 negative colonoscopies; HR, 0.61; 95% CI, 0.52-0.73). There was a similar reduction in risk of postcolonoscopy CRC-related mortality (0.05 versus 0.22 per 10,000 person-years; absolute difference in 7-year risk, –1.2 per 10,000 negative colonoscopies; HR, 0.26; 95% CI, 0.11-0.65).
These findings may be limited in generalizability to physicians with lower procedure volumes or to populations with different adenoma prevalence.
“Given the strong, consistent associations of higher adenoma detection rates with colonoscopy effectiveness for reducing colorectal cancer incidence and mortality, the current results support more research to identify reliable and readily adoptable methods for increasing adenoma detection rates among physicians with lower values across diverse settings,” the researchers wrote.
The improvement over a broad range of ADRs, along with other recent findings, suggests that there may need to be updates to the use of ADRs as a quality metric, according to an accompanying editorial by Douglas K. Rex, MD, of the division of gastroenterology/hepatology at Indiana University, Indianapolis. For example, it’s possible that ADRs could be measured by averaging values from screening, diagnostic, and surveillance colonoscopy. The editorialist suggested that, if improvements in interval cancer rates continue as ADRs approach 50%, the current view of the ADR as a minimally acceptable standard may require reconsideration. Instead, it may be appropriate to continue with a minimum threshold but add a much higher, aspirational target. Dr. Rex also suggested that highly variable detection of sessile serrated lesions could be excluded from ADRs in order to reduce variability.
Factors to consider
The study is useful, but it doesn’t address the disparity in adenoma detection that exists between individual doctors, according to Dr. Kosinski, founder and chief medical officer of SonarMD and previously director of a large gastroenterology clinic. “Even if you look at doctors who do a minimum of 250 screening colonoscopies in a year, there’s still variability. There was even a study published in 2014 showing ADRs anywhere from 7.4% to 52.5%. The bell curve is broad,” he said.
As patients age, they have a higher frequency of polyps appearing on the right side of the colon, and those polyps are flatter and more easily missed than polyps on the left side. “The variation in ADR is higher on the right side of the colon than it is on the left. Doctors have to really do a very good job of examining that right side of the colon so that they don’t miss the flat polyps,” said Dr. Kosinski.
To improve ADRs, Dr. Kosinski emphasized the need to take the required time out to complete a procedure, despite the tight schedules often faced by ambulatory centers. “It’s the time you take coming out of the colon that’s critical. You owe it to the patient,” he said.
And if a patient hasn’t prepped well enough, it’s better to send the patient home without the procedure than to conduct a poor-quality screening. “If you can’t see the mucosal surface, you can’t tell the patient that they have a negative colonoscopy. If you have to do more cleaning during the procedure, then do more cleaning during the procedure. If you have to cancel the procedure and bring the patient back, it’s better to do that than it is to do an incomplete colonoscopy,” said Dr. Kosinski.
He also stressed the need to make sure that the patient is properly sedated and comfortable “so that you can do the job you’re supposed to do,” he said.
Some authors disclosed relationships with Amgen and the National Cancer Institute. Dr. Rex disclosed relationships with Olympus, Boston Scientific, Aries, and others, all outside the submitted work.
“It basically reaffirms what we’ve believed for the longest time, and other research work has documented – that interim cancers are higher in association with lower adenoma detection rates. The higher you can get that adenoma detection rate, the more we’re going to be able to lower the [rate of] cancers that develop within 3 years of a colonoscopy,” said Lawrence Kosinski, MD, who was asked to comment on the study.
The study included 735,396 patients with a median age of 61.4 years. Among these patients, 852,624 negative colonoscopies were performed by 383 eligible physicians. Participating physicians had to perform at least 25 screening colonoscopies and 100 total colonoscopies per year. After 2.4 million person-years of follow-up, the researchers observed 619 postcolonoscopy colorectal cancers and 36 related deaths over a median follow-up of 3.25 years.
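The reported counts imply an overall incidence rate consistent with the subgroup figures quoted later in the article. A back-of-envelope check (illustrative only; both inputs are the study's reported totals):

```python
# Overall postcolonoscopy CRC incidence implied by the reported counts,
# expressed per 10,000 person-years to match the subgroup rates in the text.
cancers = 619
person_years = 2.4e6  # 2.4 million person-years of follow-up
rate_per_10k = cancers / person_years * 10_000
print(round(rate_per_10k, 2))  # ~2.58, between the 1.79 and 3.10 subgroup rates
```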
There was an association between each 1% increase in ADR and a reduced probability of postcolonoscopy CRC (hazard ratio [HR], 0.97; 95% confidence interval [CI], 0.96-0.98) and mortality from postcolonoscopy CRC (HR, 0.95; 95% CI, 0.92-0.99).
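Under the proportional-hazards model used in the study, the per-1% estimate compounds multiplicatively across a spread of ADR values. A minimal sketch of that arithmetic (the 0.97 hazard ratio is from the study; the 10-point spread is an arbitrary illustration, not a study figure):

```python
# How a per-point hazard ratio compounds over a multi-point ADR difference,
# assuming the log-linear relationship implied by the per-1% estimate.
def compounded_hr(hr_per_point: float, points: float) -> float:
    return hr_per_point ** points

# A hypothetical 10-point ADR difference (e.g., 28% vs. 38%):
print(round(compounded_hr(0.97, 10), 3))  # 0.737, i.e., roughly 26% lower hazard
```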
The median ADR was 28.3%. There was an association between ADR above the median versus below the median and a reduced risk of postcolonoscopy CRC with 1.79 cases versus 3.10 cases per 10,000 person-years, respectively (absolute difference in 7-year risk, –12.2 per 10,000 negative colonoscopies; HR, 0.61; 95% CI, 0.52-0.73). There was a similar reduction in risk of postcolonoscopy CRC-related mortality (0.05 versus 0.22 per 10,000 person-years; absolute difference in 7-year risk, –1.2 per 10,000 negative colonoscopies; HR, 0.26; 95% CI, 0.11-0.65).
These findings may not generalize to physicians with lower procedure volumes or to populations with a different adenoma prevalence.
“Given the strong, consistent associations of higher adenoma detection rates with colonoscopy effectiveness for reducing colorectal cancer incidence and mortality, the current results support more research to identify reliable and readily adoptable methods for increasing adenoma detection rates among physicians with lower values across diverse settings,” the researchers wrote.
The improvement over a broad range of ADRs, along with other recent findings, suggests that there may need to be updates to the use of ADRs as a quality metric, according to an accompanying editorial by Douglas K. Rex, MD, of the division of gastroenterology/hepatology at Indiana University, Indianapolis. For example, it’s possible that ADRs could be measured by averaging values from screening, diagnostic, and surveillance colonoscopy. The editorialist suggested that, if improvements in interim cancer rates continue as ADRs approach 50%, the current view of ADRs as a minimally acceptable standard may require reconsideration. Instead, it may be appropriate to continue with a minimum threshold but add a much higher, aspirational target. Dr. Rex also suggested that highly variable detection of sessile serrated lesions could be excluded from ADRs to reduce variability.
Factors to consider
The study is useful, but it doesn’t address the disparity in adenoma detection that exists between individual doctors, according to Dr. Kosinski, founder and chief medical officer of SonarMD and previously director of a large gastroenterology clinic. “Even if you look at doctors who do a minimum of 250 screening colonoscopies in a year, there’s still variability. There was even a study published in 2014 showing ADRs anywhere from 7.4% to 52.5%. The bell curve is broad,” he said.
As patients age, they have a higher frequency of polyps appearing on the right side of the colon, and those polyps are flatter and more easily missed than polyps on the left side. “The variation in ADR is higher on the right side of the colon than it is on the left. Doctors have to really do a very good job of examining that right side of the colon so that they don’t miss the flat polyps,” said Dr. Kosinski.
To improve ADRs, Dr. Kosinski emphasized the need to take the required time out to complete a procedure, despite the tight schedules often faced by ambulatory centers. “It’s the time you take coming out of the colon that’s critical. You owe it to the patient,” he said.
And if a patient hasn’t prepped well enough, it’s better to send the patient home without the procedure than to conduct a poor-quality screening. “If you can’t see the mucosal surface, you can’t tell the patient that they have a negative colonoscopy. If you have to do more cleaning during the procedure, then do more cleaning during the procedure. If you have to cancel the procedure and bring the patient back, it’s better to do that than it is to do an incomplete colonoscopy,” said Dr. Kosinski.
He also stressed the need to make sure that the patient is properly sedated and comfortable “so that you can do the job you’re supposed to do,” he said.
Some authors disclosed relationships with Amgen and the National Cancer Institute. Dr. Rex disclosed relationships with Olympus, Boston Scientific, Aries, and others, all outside the submitted work.
FROM JAMA
Prior decompensation in alcohol-associated hepatitis not an ‘absolute contraindication’ for early liver transplant
Past decompensation in alcohol-associated hepatitis may be linked with worse survival following liver transplantation, but it’s not all bad news, according to a retrospective study.
Traditionally, patients with alcoholic liver disease were asked to be alcohol free for 6 months before consideration for a liver transplantation. In recent years, there’s been a loosening of that policy, with physicians considering “early” liver transplantation (early LT) instead of waiting 6 months. “It became obvious that a lot of patients do resume alcohol use after transplant, and most of them don’t appear to suffer too much in the way of adverse consequences,” said Paul Martin, MD, chief of hepatology at the University of Miami, who was not involved in the current research.
In 2011, a study confirmed that suspicion, finding that 6-month survival was 77% among carefully selected patients with alcohol-associated hepatitis for whom the 6-month sobriety requirement was waived; 6-month survival in those who did not receive a transplant was 22%. The selection criteria included the presence of supportive family members, the absence of severe coexisting conditions, and a commitment to abstaining from alcohol.
However, authors of the current study, published in the American Journal of Gastroenterology, sought nuance: The appropriateness of prior decompensation as an exclusion criterion in published studies is unknown, so the researchers compared outcomes of patients with prior versus first-time liver decompensation in alcohol-associated hepatitis.
Not all bad news
The study included 241 patients from six sites who consecutively received early LT between 2007 and 2020. Among these, 210 were identified as having a first-time liver decompensation event and 31 as having had a prior history of liver decompensation, defined as being diagnosed with ascites, hepatic encephalopathy, variceal bleeding, or jaundice.
There was no significant difference in median age, Model for End-Stage Liver Disease (MELD) scores, or post–liver transplant follow-up time between those with first-time liver decompensation or a prior history. The unadjusted 1-year survival rate was 93% in the first decompensation group (95% confidence interval, 89%-96%) and 86% in the prior decompensation group (95% CI, 66%-94%). The unadjusted 3-year survival rates were 85% (95% CI, 79%-90%) and 78% (95% CI, 57%-89%), respectively.
Importantly, the researchers found an association between prior decompensation and higher adjusted post–liver transplantation mortality (adjusted hazard ratio, 2.72; 95% CI, 1.61-4.59) and harmful alcohol use (aHR, 1.77; 95% CI, 1.07-2.92).
However, the researchers noted that these patients, who had MELD scores of 39 and previous decompensation, were at exceptionally high risk of short-term mortality, but still had 1- and 3-year survival rates above 85% and 75%, respectively, with early LT. “While longer follow-up is desirable as graft failure related to alcohol is most apparent after 5 years post LT, these results suggest that prior decompensation alone should not be considered an absolute contraindication to early LT.”
Limitations of the study included its retrospective data and small sample size for patients with prior decompensation.
“These findings validate the value of the ‘first decompensation’ criteria in published experiences regarding early LT for [alcoholic hepatitis],” the investigators concluded. “Further larger and prospective studies with longer-term follow-up will be needed to assess ways to optimally select patients in this cohort who may benefit most from early LT, and ways to manage patients at highest risk for worse outcomes post LT.”
A note of caution for early LT
About half of all liver-related mortality is attributable to alcohol-associated liver disease. Corticosteroids can improve short-term survival, but no medications are proven to increase long-term survival. That leaves liver transplantation as the sole alternative for patients who don’t respond to corticosteroids.
“Programs in North America have liberalized their acceptance criteria for patients with alcoholic liver disease, and that’s resulted in large numbers of patients being transplanted who have less than 6 months abstinence. And overall, the results seem good, but I think this paper strikes an appropriate note of caution. In essence, if a patient had at least one prior episode of liver failure related to alcoholic excess and had recovered from that, and continued to drink and got into trouble again, [and then] presented for consideration for liver transplantation, the fact that they resumed alcohol use after prior episodes of decompensation suggests that they may be less-than-ideal candidates [for liver transplantation],” said Dr. Martin.
He pointed out important caveats to the study, including its retrospective nature and its inclusion of a relatively small number of patients with a history of liver decompensation. But it reinforces what physicians generally know, which is that some patients with severe alcohol use disorder also have liver failure, and they tend to fare worse than others after a liver transplant.
Still, physicians also face a conundrum because there are increasing numbers of younger patients who won’t survive if they don’t get a liver transplant. “The challenge is picking out patients who are going to be good candidates from a purely medical point of view, but have a low likelihood of resuming alcohol use after transplantation [which could injure] the new liver,” said Dr. Martin. The new study has the potential to provide some additional guidance in patient selection.
The study authors disclosed no relevant conflicts of interest. Dr. Martin has no relevant financial disclosures.
FROM THE AMERICAN JOURNAL OF GASTROENTEROLOGY
Life-threatening adverse events in liver cancer less frequent with ICI therapy
Immune checkpoint inhibitors (ICIs) are associated with fewer life-threatening adverse events in patients with hepatocellular carcinoma (HCC) than tyrosine kinase inhibitors (TKIs), shows a new systematic review and meta-analysis.
The study, which was published online in JAMA Network Open, found that ICIs were associated with fewer serious adverse events, such as death, illness requiring hospitalization, or illness leading to disability.
The findings are based on a meta-analysis of 30 randomized clinical trials and 12,921 patients. The analysis found a greater frequency of serious adverse events among those treated with TKIs than those treated with ICIs, though the rates of less serious liver-related adverse events were similar.
“When considering objective response rates, combination therapy with atezolizumab and bevacizumab or lenvatinib alone likely offer the most promise in the neoadjuvant setting in terms of objective response and toxic effects without preventing patients from reaching surgery,” the authors wrote.
Most newly diagnosed cases of HCC are unresectable, which leads to palliative treatment. When disease is advanced, systemic treatment is generally chosen, and new options introduced in the past decade have boosted survival. Many of these approaches feature ICIs and TKIs.
HCC therapy continues to evolve, with targeted surgical and locoregional therapies like ablation and embolization, and it’s important to understand how side effects from ICIs and TKIs might impact follow-on procedures.
Neoadjuvant therapy can avoid delays to adjuvant chemotherapy that might occur due to surgical complications. Neoadjuvant therapy also has the potential to downstage the disease from advanced to resectable, and it can provide greater opportunity for patient selection based on both tumor biology and patient characteristics.
However, advanced HCC is a complicated condition. Patients typically have cirrhosis and require an adequate functional liver remnant. Neoadjuvant locoregional treatment has been studied in HCC. A systematic review of 55 studies found no significant difference in disease-free or overall survival between preoperative or postoperative transarterial chemoembolization in resectable HCC. There is some weak evidence that locoregional therapies may achieve downstaging or maintain candidacy past 6 months.
The median age of participants was 62 years. Among the included studies, on average, 84% of patients were male. The mean percentage of patients with disease extending outside the liver was 61%, and the mean percentage with microvascular invasion was 28%. A mean of 82% had stage C disease according to Barcelona Clinic Liver Cancer staging.
Liver toxicities occurred in 21% of patients who received TKIs (95% confidence interval, 16%-26%) versus 24% of patients receiving ICIs (95% CI, 13%-35%). Severe adverse events were more common with TKIs, with a frequency of 46% (95% CI, 40%-51%), compared with 24% of those who received ICIs (95% CI, 13%-35%).
TKIs other than sorafenib were associated with higher rates of severe adverse events (risk ratio, 1.24; 95% CI, 1.07-1.44). ICIs and sorafenib had similar rates of liver toxic effects and severe adverse events.
The study has some limitations, including variation across the included trials both in how adverse events were reported and in the inclusion criteria.
(TKIs), shows a new systematic review and meta-analysis.
The study, which was published online in JAMA Network Open, found that ICIs were associated with fewer serious adverse events, such as death, illness requiring hospitalization or illness leading to disability.
The findings are based on a meta-analysis of 30 randomized clinical trials and 12,921 patients. The analysis found a greater frequency of serious adverse events among those treated with TKIs than those treated with ICIs, though the rates of less serious liver-related adverse events were similar.
“When considering objective response rates, combination therapy with atezolizumab and bevacizumab or lenvatinib alone likely offer the most promise in the neoadjuvant setting in terms of objective response and toxic effects without preventing patients from reaching surgery,” the authors wrote.
Most newly diagnosed cases of HCC are unresectable, which leads to palliative treatment. When disease is advanced, systemic treatment is generally chosen, and new options introduced in the past decade have boosted survival. Many of these approaches feature ICIs and TKIs.
HCC therapy continues to evolve, with targeted surgical and locoregional therapies like ablation and embolization, and it’s important to understand how side effects from ICIs and TKIs might impact follow-on procedures.
Neoadjuvant therapy can avoid delays to adjuvant chemotherapy that might occur due to surgical complications. Neoadjuvant therapy also has the potential to downstage the disease from advanced to resectable, and it can provide greater opportunity for patient selection based on both tumor biology and patient characteristics.
However, advanced HCC is a complicated condition. Patients typically have cirrhosis and require an adequate functional liver remnant. Neoadjuvant locoregional treatment has been studied in HCC. A systematic review of 55 studies found no significant difference in disease-free or overall survival between preoperative or postoperative transarterial chemoembolization in resectable HCC. There is some weak evidence that locoregional therapies may achieve downstaging or maintain candidacy past 6 months.
The median age of participants was 62 years. Among the included studies, on average, 84% of patients were male. The mean fraction of patients with disease originating outside the liver was 61%, and the mean percentage with microvascular invasion was 28%. A mean of 82% had stage C according to Barcelona Clinic Liver Center staging.
21% of patients who received TKIs (95% confidence interval, 16%-26%) experienced liver toxicities versus 24% (95% CI, 13%-35%) of patients receiving ICIs. Severe adverse events were more common with TKIs, with a frequency of 46% (95% CI, 40%-51%), compared with 24% of those who received ICIs (95% CI, 13%-35%).
TKIs other than sorafenib were associated with higher rates of severe adverse events (risk ratio, 1.24; 95% CI, 1.07-1.44). ICIs and sorafenib had similar rates of liver toxic effects and severe adverse events.
The study has some limitations, including variations within the included studies in the way adverse events were reported, and there was variation in the inclusion criteria.
(TKIs), shows a new systematic review and meta-analysis.
The study, which was published online in JAMA Network Open, found that ICIs were associated with fewer serious adverse events, such as death, illness requiring hospitalization or illness leading to disability.
The findings are based on a meta-analysis of 30 randomized clinical trials and 12,921 patients. The analysis found a greater frequency of serious adverse events among those treated with TKIs than those treated with ICIs, though the rates of less serious liver-related adverse events were similar.
“When considering objective response rates, combination therapy with atezolizumab and bevacizumab or lenvatinib alone likely offer the most promise in the neoadjuvant setting in terms of objective response and toxic effects without preventing patients from reaching surgery,” the authors wrote.
Most newly diagnosed cases of HCC are unresectable, which leads to palliative treatment. When disease is advanced, systemic treatment is generally chosen, and new options introduced in the past decade have boosted survival. Many of these approaches feature ICIs and TKIs.
HCC therapy continues to evolve, with targeted surgical and locoregional therapies like ablation and embolization, and it’s important to understand how side effects from ICIs and TKIs might impact follow-on procedures.
Neoadjuvant therapy can avoid delays to adjuvant chemotherapy that might occur due to surgical complications. Neoadjuvant therapy also has the potential to downstage the disease from advanced to resectable, and it can provide greater opportunity for patient selection based on both tumor biology and patient characteristics.
However, advanced HCC is a complicated condition. Patients typically have cirrhosis and require an adequate functional liver remnant. Neoadjuvant locoregional treatment has been studied in HCC. A systematic review of 55 studies found no significant difference in disease-free or overall survival between preoperative or postoperative transarterial chemoembolization in resectable HCC. There is some weak evidence that locoregional therapies may achieve downstaging or maintain candidacy past 6 months.
The median age of participants was 62 years. Among the included studies, on average, 84% of patients were male. The mean percentage of patients with disease that had spread outside the liver was 61%, and the mean percentage with microvascular invasion was 28%. A mean of 82% had stage C disease according to Barcelona Clinic Liver Cancer staging.
Among patients who received TKIs, 21% (95% confidence interval [CI], 16%-26%) experienced liver toxicities, versus 24% (95% CI, 13%-35%) of patients receiving ICIs. Severe adverse events were more common with TKIs, occurring in 46% (95% CI, 40%-51%), compared with 24% of patients who received ICIs (95% CI, 13%-35%).
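For readers unfamiliar with the notation, the bracketed ranges are 95% confidence intervals around each pooled proportion. As a rough illustration only (the paper's intervals come from a meta-analytic model, not a single cohort, and the 250-patient cohort below is hypothetical), a normal-approximation interval for a proportion of 0.21 lands close to the reported 16%-26%:

```python
import math

def wald_ci(p: float, n: int, z: float = 1.96) -> tuple[float, float]:
    """Normal-approximation (Wald) 95% CI for a proportion p observed in n patients."""
    se = math.sqrt(p * (1 - p) / n)  # standard error of the proportion
    return p - z * se, p + z * se

# 21% liver toxicity in a hypothetical cohort of 250 patients
lo, hi = wald_ci(0.21, 250)
print(f"{lo:.0%}-{hi:.0%}")  # 16%-26%
```

The interval narrows as the number of patients grows, which is why pooled analyses of many trials can report fairly tight ranges.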
TKIs other than sorafenib were associated with higher rates of severe adverse events (risk ratio, 1.24; 95% CI, 1.07-1.44). ICIs and sorafenib had similar rates of liver toxic effects and severe adverse events.
The study has limitations, including variation among the included studies in how adverse events were reported and in their inclusion criteria.
FROM JAMA NETWORK OPEN
Lung cancer treatment combo may be effective after ICI failure
In a phase 2 clinical trial, the combination of an immune checkpoint inhibitor (ICI) and a vascular endothelial growth factor (VEGF) inhibitor led to improved overall survival versus standard of care in patients with non–small cell lung cancer (NSCLC) who had failed previous ICI therapy.
NSCLC patients usually receive immune checkpoint inhibitor therapy at some point, whether in the adjuvant or neoadjuvant setting or, for stage 3 patients, after radiation. “The majority of patients who get diagnosed with lung cancer will get some sort of immunotherapy, and we know that at least from the advanced setting, about 15% of those will have long-term responses, which means the majority of patients will develop tumor resistance to immune checkpoint inhibitor therapy,” said Karen L. Reckamp, MD, who is the lead author of the study published online in the Journal of Clinical Oncology.
That clinical need has led to the combination of ICIs with VEGF inhibitors. This approach is approved for first-line therapy of renal cell, endometrial, and hepatocellular cancer. Along with its effect on tumor vasculature, VEGF inhibition assists in the activation and maturation of dendritic cells, as well as in attracting cytotoxic T cells to the tumor. “By both changing the vasculature and changing the tumor milieu, there’s a potential to overcome that immune suppression and potentially overcome that [ICI] resistance,” said Dr. Reckamp, who is associate director of clinical research at Cedars-Sinai Medical Center, Los Angeles. “The results of the study were encouraging. We would like to confirm this finding in a phase 3 trial and potentially provide to patients an option that does not include chemotherapy and can potentially overcome resistance to their prior immune checkpoint inhibitor therapy,” Dr. Reckamp said.
The study included 136 patients. The median patient age was 66 years and 61% were male. The ICI/VEGF arm had better overall survival (hazard ratio, 0.69; SLR one-sided P = .05). The median overall survival was 14.5 months in the ICI/VEGF arm, versus 11.6 months in the standard care arm. Both arms had similar response rates, and grade 3 or higher treatment-related adverse events were more common in the chemotherapy arm (60% versus 42%).
The next step is a phase 3 trial and Dr. Reckamp hopes to improve patient selection for VEGF inhibitor and VEGF receptor inhibitor therapy. “The precision medicine that’s associated with other tumor alterations has kind of been elusive for VEGF therapies, but I would hope with potentially a larger trial and understanding of some of the biomarkers that we might find a more select patient population who will benefit the most,” Dr. Reckamp said.
She also noted that the comparative arm in the phase 2 study was a combination of docetaxel and ramucirumab. “That combination has shown to be more effective than single agent docetaxel alone so [the new study] was really improved overall survival over the best standard of care therapy we have,” Dr. Reckamp said.
The study was funded, in part, by Eli Lilly and Company and Merck Sharp & Dohme Corp. Dr. Reckamp disclosed ties to Amgen, Tesaro, Takeda, AstraZeneca, Seattle Genetics, Genentech, Blueprint Medicines, Daiichi Sankyo/Lilly, EMD Serono, Janssen Oncology, Merck KGaA, GlaxoSmithKline, and Mirati Therapeutics.
FROM THE JOURNAL OF CLINICAL ONCOLOGY
The shifting sands of lung cancer screening
An analysis of trends in lung cancer screening since March 2021, when the U.S. Preventive Services Task Force (USPSTF) expanded the eligibility criteria for lung cancer screening, shows that significantly more Black men have been screened for lung cancer, but not more women or undereducated people.
Eligibility for lung cancer screening was expanded in 2021 to include men and women 50 years and older with a smoking history of at least 20 pack-years, for example, one pack of cigarettes a day for 20 years.
“Expansion of screening criteria is a critical first step to achieving equity in lung cancer screening for all high-risk populations, but myriad challenges remain before individuals enter the door for screening,” wrote the authors, led by Julie A. Barta, MD, Thomas Jefferson University, Philadelphia. “Health policy changes must occur simultaneously with efforts to expand community outreach, overcome logistical barriers, and facilitate screening adherence. Only after comprehensive strategies to dismantle screening barriers are identified, validated, and implemented can there be a truly equitable landscape for lung cancer screening.”
For the study, published in JAMA Network Open, researchers examined rates of centralized lung cancer screening in the Baltimore area. In addition to expanding lung cancer screening generally, there was hope that the expanded criteria might increase uptake of screening in populations that are traditionally underserved, such as African American, Hispanic, and female patients. Of 815 people screened during the study period (March-December 2021), 161 were newly eligible for screening under the 2021 criteria.
“There’s been quite a bit of work in the field demonstrating that Black men and women develop lung cancer at more advanced stages of disease, and they often are diagnosed at younger ages and have fewer pack-years of smoking. So the hypothesis was that this would reduce some of the disparities seen in lung cancer screening by making more people eligible,” Dr. Barta said in an interview.
The researchers categorized participants as those who would have been eligible for screening under the USPSTF 2013 guideline (age 55 or older, 30 or more pack-years, quit within the past 15 years), and those who would be eligible under the 2021 guideline (age 50 or older, 20 or more pack-years, quit within the past 15 years). Of the 2021 cohort, 54.5% were African American, versus 39.5% of the 2013 cohort (P = .002). There were no differences between the cohorts with respect to education level or gender.
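The two criteria sets reduce to a simple threshold check. The sketch below is purely illustrative of the cutoffs quoted above (the function names and example patients are hypothetical, not drawn from the study):

```python
# USPSTF eligibility thresholds as described above:
#   2013 guideline: age >= 55, >= 30 pack-years, quit within past 15 years
#   2021 guideline: age >= 50, >= 20 pack-years, quit within past 15 years

def pack_years(packs_per_day: float, years_smoked: float) -> float:
    """Pack-years = average packs smoked per day x years of smoking."""
    return packs_per_day * years_smoked

def eligible(age: int, packs_per_day: float, years_smoked: float,
             years_since_quit: float = 0.0, guideline: int = 2021) -> bool:
    min_age, min_pack_years = (55, 30) if guideline == 2013 else (50, 20)
    return (age >= min_age
            and pack_years(packs_per_day, years_smoked) >= min_pack_years
            and years_since_quit <= 15)

# A 52-year-old with a one-pack-a-day, 20-year history is newly eligible
# under the 2021 criteria but would not have qualified under 2013.
print(eligible(52, 1.0, 20, guideline=2021))  # True
print(eligible(52, 1.0, 20, guideline=2013))  # False
```

Patients like the hypothetical one above, younger and with lighter smoking histories, are exactly the group the 2021 expansion was intended to capture.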
“Although we’ve seen some encouraging improvement in terms of getting more eligible patients into our screening program, there’s still a lot of work to be done in the field,” Dr. Barta said. “Diagnosing lung cancer at earlier stages of disease is more cost effective in general for the health care system than fighting lung cancer at advanced stages, which requires more complex and multimodal and prolonged therapies.”
New evidence: Chest CT lung cancer screening reduces incidence of advanced lung cancer
In an analysis of the SEER database presented in June at the annual meeting of the American Society of Clinical Oncology, the adoption of low-dose chest computed tomography (LDCT) led to fewer diagnoses of advanced lung cancer, although the declines varied significantly by race and ethnicity. Non-Hispanic Black patients appeared to benefit the most, with a 55% decline (P < .01), while Hispanic patients had the smallest decline, at 41% (P < .01). LDCT screening was recommended by the USPSTF in 2013 after the National Lung Screening Trial revealed a 20% relative reduction in mortality when CT scans were used instead of chest radiography. The Centers for Medicare and Medicaid Services approved coverage of the screening in 2015.
The SEER study looked at data from 400,343 individuals from 2004-2014 (preintervention) and 2015-2018 (postintervention). The age-adjusted incidence of advanced lung cancer declined during both periods, but the decline was sharper between 2015 and 2018, with three fewer cases per 100,000 people than 2004-2014 (P < .01). Similar patterns were seen in subanalyses of males and females, non-Hispanic Whites, non-Hispanic Blacks, and Hispanics. The relative declines were largest in women, non-Hispanic Blacks, and people who lived outside of Metropolitan areas.
During a Q&A session that followed the presentation, Robert Smith, PhD, pointed out that the bar for lung cancer risk eligibility has been set quite high, following the eligibility criteria for clinical trials. “We are missing opportunities to prevent avertable lung cancer deaths,” said Dr. Smith, senior vice president of cancer screening at the American Cancer Society.
On the other hand, screening-prompted biopsies have the potential to cause harm, particularly in patients who already have lung disease, said Douglas Allen Arenberg, MD, professor at the University of Michigan, Ann Arbor. “I think that’s what scares most people is the potential downside, which is very hard to measure outside of a clinical trial,” said Dr. Arenberg, who served as a discussant for the presentation.
One way to reduce that risk is to identify biomarkers, either for screens or for incidentally detected nodules, that have good negative predictive value. “If I had a blood test that is as good as a negative PET scan, I’m going to be much more likely to say, ‘Yeah, you’re 40 and your grandfather had lung cancer. Maybe you should get a CT.’ If we had that, we could screen a lot more people. Right now, I would discourage anybody who is at low risk from getting screened because when they come to me, the biggest opportunity I have to do harm is when I do a biopsy, and you always remember the ones that go wrong,” he said.
Dr. Arenberg also called for improvements in electronic medical records to better flag at-risk patients. “I think we as physicians have to demand more of the software developers that create these EMRs for us,” he said.
Another study in the same session used data from 1,391,088 patients drawn from the National Cancer Database between 2010 and 2017 to examine trends in diagnosis of stage I cancer. In 2010, 23.5% of patients were diagnosed as stage I, versus 29.1% in 2017. Stage I incidence increased from 25.8% to 31.7% in non–small cell lung cancer, but there was no statistically significant change in small cell lung cancer. As with the SEER database study, the researchers noted that the shift toward stage I diagnoses predated the recommendation of LDCT.
Dr. Arenberg suggested that the trend may come down to increased frequency of CT scans, which often collect incidental images of the lungs. He added that better access to care may also be helping to drive the change. “How much of that might have had something to do with the introduction 5 or 10 years earlier of the Affordable Care Act and people just simply having access to care and taking advantage of that?” Dr. Arenberg said.
But Dr. Arenberg said that not even screening can explain all the data. He referenced a stage shift in patients of all age groups in the National Cancer Database study, even those too young to be eligible for screening. “There’s something else going on here. It would be nice for us to understand what caused these trends, so perhaps we could accentuate that trend even more, but stage shifts are clearly occurring in lung cancer,” Dr. Arenberg said.
Dr. Barta has received grants from Genentech Health Equity Innovations Fund. Dr. Arenberg has no relevant financial disclosures. Dr. Smith’s potential disclosures could not be ascertained.
An analysis of trends in lung cancer screening since March 2021 when the U.S. Preventive Services Task Force (USPSTF) expanded the eligibility criteria for lung cancer screening, shows that significantly more Black men have been screened for lung cancer, but not women or undereducated people.
The eligibility for lung cancer screening was expanded in 2021 to include men and women under 50 years old and people who smoke at least one pack of cigarettes a day for the last 20 years. “
“Expansion of screening criteria is a critical first step to achieving equity in lung cancer screening for all high-risk populations, but myriad challenges remain before individuals enter the door for screening,” wrote the authors, led by Julie A. Barta, MD, Thomas Jefferson University, Philadelphia. “Health policy changes must occur simultaneously with efforts to expand community outreach, overcome logistical barriers, and facilitate screening adherence. Only after comprehensive strategies to dismantle screening barriers are identified, validated, and implemented can there be a truly equitable landscape for lung cancer screening.”
For the study, published in JAMA Open Network, researchers examined rates of centralized lung cancer screening in the Baltimore area. In addition to expanding lung cancer screening generally, there was hope that the expanded criteria might increase uptake of screening in populations that are traditionally underserved, such as African American, Hispanic, and female patients. Of 815 people screened during the study period (March-December 2021), 161 were newly eligible for screening under the 2021 criteria.
“There’s been quite a bit of work in the field demonstrating that Black men and women develop lung cancer at more advanced stages of disease, and they often are diagnosed at younger ages and have fewer pack-years of smoking. So the hypothesis was that this would reduce some of the disparities seen in lung cancer screening by making more people eligible,” Dr. Barta said in an interview.
The researchers categorized participants as those who would have been eligible for screening under the USPSTF 2013 guideline (age 55 or older, 30 or more pack-years, quit within the past 15 years), and those who would be eligible under the 2021 guideline (age 50 or older, 20 or more pack-years, quit within the past 15 years). Of the 2021 cohort, 54.5% were African American, versus 39.5% of the 2013 cohort (P = .002). There were no differences between the cohorts with respect to education level or gender.
“Although we’ve seen some encouraging improvement in terms of getting more eligible patients into our screening program, there’s still a lot of work to be done in the field,” Dr. Barta said. “Diagnosing lung cancer at earlier stages of disease is more cost effective in general for the health care system than fighting lung cancer at advanced stages, which requires more complex and multimodal and prolonged therapies.”
New evidence: Chest CTs for lung cancer screening reduces incidence of advanced lung cancer
In an analysis of the SEER database presented in June at the annual meeting of the American Society of Clinical Oncology, the adoption of low-dose chest computed tomography (LDCT) led to fewer diagnoses of advanced lung cancer, although these declines varied significantly by race and ethnicity. Non-Hispanic Blacks seemed to benefit the most with a 55% decline (P < .01), while Hispanics had the lowest rate of decline at 41% (P < .01). The change was recommended by USPSTF in 2013 after the National Lung Screening Trial revealed a 20% relative reduction in mortality when CT scans were used instead of chest radiography. The Centers for Medicare and Medicaid Services approved coverage of the screen in 2015.
The SEER study looked at data from 400,343 individuals from 2004-2014 (preintervention) and 2015-2018 (postintervention). The age-adjusted incidence of advanced lung cancer declined during both periods, but the decline was sharper between 2015 and 2018, with three fewer cases per 100,000 people than 2004-2014 (P < .01). Similar patterns were seen in subanalyses of males and females, non-Hispanic Whites, non-Hispanic Blacks, and Hispanics. The relative declines were largest in women, non-Hispanic Blacks, and people who lived outside of Metropolitan areas.
During a Q&A session that followed the presentation, Robert Smith, PhD, pointed out that the bar for eligibility of lung cancer risk has been set quite high, following the eligibility criteria for clinical trials. He noted that . “We are missing opportunities to prevent avertable lung cancer deaths,” said Dr. Smith, senior vice president of cancer screening at the American Cancer Society.
On the other hand, screening-prompted biopsies have the potential to cause harm, particularly in patients who already have lung disease, said Douglas Allen Arenberg, MD, professor at the University of Michigan, Ann Arbor. “I think that’s what scares most people is the potential downside, which is very hard to measure outside of a clinical trial,” said Dr. Arenberg, who served as a discussant for the presentation.
One way to reduce that risk is to identify biomarkers, either for screens or for incidentally-detected nodules, that have good negative predictive value. “If I had a blood test that is as good as a negative PET scan, I’m going to be much more likely to say, ‘Yeah, you’re 40 and your grandfather had lung cancer. Maybe you should get a CT. If we had that, we could screen a lot more people. Right now, I would discourage anybody who is at low risk from getting screened because when they come to me, the biggest opportunity I have to do harm is when I do a biopsy, and you always remember the ones that go wrong,” he said.
Dr. Arenberg also called for improvements in electronic medical records to better flag at-risk patients. “I think we as physicians have to demand more of the software developers that create these EMRs for us,” he said.
Another study in the same session used data from 1,391,088 patients drawn from the National Cancer Database between 2010 and 2017 to examine trends in diagnosis of stage I cancer. In 2010, 23.5% of patients were diagnosed as stage I, versus 29.1% in 2017. Stage I incidence increased from 25.8% to 31.7% in non–small cell lung cancer, but there was no statistically significant change in small cell lung cancer. As with the SEER database study, the researchers noted that the shift toward stage I diagnoses predated the recommendation of LDCT.
Dr. Arenberg suggested that the trend may come down to increased frequency of CT scans, which often collect incidental images of the lungs. He added that better access to care may also be helping to drive the change. “How much of that might have had something to do with the introduction 5 or 10 years earlier of the Affordable Care Act and people just simply having access to care and taking advantage of that?” Dr. Arenberg said.
But Dr. Arenberg said that not even screening can explain all the data. He referenced a stage shift in patients of all age groups in the National Cancer Database study, even those too young to be eligible for screening. “There’s something else going on here. It would be nice for us to understand what caused these trends, so perhaps we could accentuate that trend even more, but stage shifts are clearly occurring in lung cancer,” Dr. Arenberg said.
Dr. Barta has received grants from Genentech Health Equity Innovations Fund. Dr. Arenberg has no relevant financial disclosures. Dr. Smith’s potential disclosures could not be ascertained.
An analysis of trends in lung cancer screening since March 2021 when the U.S. Preventive Services Task Force (USPSTF) expanded the eligibility criteria for lung cancer screening, shows that significantly more Black men have been screened for lung cancer, but not women or undereducated people.
The eligibility for lung cancer screening was expanded in 2021 to include men and women under 50 years old and people who smoke at least one pack of cigarettes a day for the last 20 years. “
“Expansion of screening criteria is a critical first step to achieving equity in lung cancer screening for all high-risk populations, but myriad challenges remain before individuals enter the door for screening,” wrote the authors, led by Julie A. Barta, MD, Thomas Jefferson University, Philadelphia. “Health policy changes must occur simultaneously with efforts to expand community outreach, overcome logistical barriers, and facilitate screening adherence. Only after comprehensive strategies to dismantle screening barriers are identified, validated, and implemented can there be a truly equitable landscape for lung cancer screening.”
For the study, published in JAMA Open Network, researchers examined rates of centralized lung cancer screening in the Baltimore area. In addition to expanding lung cancer screening generally, there was hope that the expanded criteria might increase uptake of screening in populations that are traditionally underserved, such as African American, Hispanic, and female patients. Of 815 people screened during the study period (March-December 2021), 161 were newly eligible for screening under the 2021 criteria.
“There’s been quite a bit of work in the field demonstrating that Black men and women develop lung cancer at more advanced stages of disease, and they often are diagnosed at younger ages and have fewer pack-years of smoking. So the hypothesis was that this would reduce some of the disparities seen in lung cancer screening by making more people eligible,” Dr. Barta said in an interview.
The researchers categorized participants as those who would have been eligible for screening under the USPSTF 2013 guideline (age 55 or older, 30 or more pack-years, quit within the past 15 years), and those who would be eligible under the 2021 guideline (age 50 or older, 20 or more pack-years, quit within the past 15 years). Of the 2021 cohort, 54.5% were African American, versus 39.5% of the 2013 cohort (P = .002). There were no differences between the cohorts with respect to education level or gender.
“Although we’ve seen some encouraging improvement in terms of getting more eligible patients into our screening program, there’s still a lot of work to be done in the field,” Dr. Barta said. “Diagnosing lung cancer at earlier stages of disease is more cost effective in general for the health care system than fighting lung cancer at advanced stages, which requires more complex and multimodal and prolonged therapies.”
New evidence: Chest CTs for lung cancer screening reduces incidence of advanced lung cancer
In an analysis of the SEER database presented in June at the annual meeting of the American Society of Clinical Oncology, the adoption of low-dose chest computed tomography (LDCT) led to fewer diagnoses of advanced lung cancer, although these declines varied significantly by race and ethnicity. Non-Hispanic Blacks seemed to benefit the most with a 55% decline (P < .01), while Hispanics had the lowest rate of decline at 41% (P < .01). The change was recommended by USPSTF in 2013 after the National Lung Screening Trial revealed a 20% relative reduction in mortality when CT scans were used instead of chest radiography. The Centers for Medicare and Medicaid Services approved coverage of the screen in 2015.
The SEER study looked at data from 400,343 individuals from 2004-2014 (preintervention) and 2015-2018 (postintervention). The age-adjusted incidence of advanced lung cancer declined during both periods, but the decline was sharper between 2015 and 2018, with three fewer cases per 100,000 people than 2004-2014 (P < .01). Similar patterns were seen in subanalyses of males and females, non-Hispanic Whites, non-Hispanic Blacks, and Hispanics. The relative declines were largest in women, non-Hispanic Blacks, and people who lived outside of Metropolitan areas.
During a Q&A session that followed the presentation, Robert Smith, PhD, pointed out that the bar for eligibility of lung cancer risk has been set quite high, following the eligibility criteria for clinical trials. He noted that . “We are missing opportunities to prevent avertable lung cancer deaths,” said Dr. Smith, senior vice president of cancer screening at the American Cancer Society.
On the other hand, screening-prompted biopsies have the potential to cause harm, particularly in patients who already have lung disease, said Douglas Allen Arenberg, MD, professor at the University of Michigan, Ann Arbor. “I think that’s what scares most people is the potential downside, which is very hard to measure outside of a clinical trial,” said Dr. Arenberg, who served as a discussant for the presentation.
One way to reduce that risk is to identify biomarkers, either for screens or for incidentally detected nodules, that have good negative predictive value. “If I had a blood test that is as good as a negative PET scan, I’m going to be much more likely to say, ‘Yeah, you’re 40 and your grandfather had lung cancer. Maybe you should get a CT.’ If we had that, we could screen a lot more people. Right now, I would discourage anybody who is at low risk from getting screened because when they come to me, the biggest opportunity I have to do harm is when I do a biopsy, and you always remember the ones that go wrong,” he said.
Dr. Arenberg also called for improvements in electronic medical records to better flag at-risk patients. “I think we as physicians have to demand more of the software developers that create these EMRs for us,” he said.
Another study in the same session used data from 1,391,088 patients drawn from the National Cancer Database between 2010 and 2017 to examine trends in diagnosis of stage I cancer. In 2010, 23.5% of patients were diagnosed as stage I, versus 29.1% in 2017. Stage I incidence increased from 25.8% to 31.7% in non–small cell lung cancer, but there was no statistically significant change in small cell lung cancer. As with the SEER database study, the researchers noted that the shift toward stage I diagnoses predated the recommendation of LDCT.
Dr. Arenberg suggested that the trend may come down to increased frequency of CT scans, which often collect incidental images of the lungs. He added that better access to care may also be helping to drive the change. “How much of that might have had something to do with the introduction 5 or 10 years earlier of the Affordable Care Act and people just simply having access to care and taking advantage of that?” Dr. Arenberg said.
But Dr. Arenberg said that not even screening can explain all the data. He referenced a stage shift in patients of all age groups in the National Cancer Database study, even those too young to be eligible for screening. “There’s something else going on here. It would be nice for us to understand what caused these trends, so perhaps we could accentuate that trend even more, but stage shifts are clearly occurring in lung cancer,” Dr. Arenberg said.
Dr. Barta has received grants from Genentech Health Equity Innovations Fund. Dr. Arenberg has no relevant financial disclosures. Dr. Smith’s potential disclosures could not be ascertained.
FROM JAMA NETWORK OPEN
Pembrolizumab for melanoma bittersweet, doctor says
CHICAGO – Pembrolizumab has shown promise as adjuvant therapy for stage IIB and IIC melanoma, according to the first interim analysis of the phase 3 KEYNOTE-716 study, recently published in The Lancet.
The findings address an unmet need, as the recurrence risk in stage IIB and IIC melanoma is “underrecognized,” said author Georgina Long, MD, co-medical director of the Melanoma Institute Australia, University of Sydney.
In fact, the recurrence risk of these patients is similar to that of patients with stage IIIB disease, wrote David Killock, PhD, in a related commentary published in Nature Reviews.
The adjuvant treatment resulted in an 89% recurrence-free survival in patients who received pembrolizumab, compared with 83% of patients in the placebo group (hazard ratio, 0.65; P = .0066). These findings were used as the basis for Food and Drug Administration approval of pembrolizumab (Keytruda, Merck) for this patient population in December 2021.
Despite the positive findings, Dr. Killock called for more research on distant metastasis-free survival, overall survival, and quality of life data to “establish the true clinical benefit of adjuvant pembrolizumab.”
At the annual meeting of the American Society of Clinical Oncology, Dr. Long presented the third interim analysis, which showed pembrolizumab reduced recurrence and distant metastases at 24 months, although the clinical benefit was relatively small: approximately an 8% improvement in recurrence-free survival and about a 6% improvement in distant metastasis-free survival. About 83% of the pembrolizumab group had treatment-related toxicities versus 64% in the placebo group. There were no deaths caused by treatment. About 90% of pembrolizumab-related endocrinopathies led to long-term hormone replacement.
In a discussion that followed the presentation at ASCO, Charlotte Eielson Ariyan, MD, PhD, said the results are bittersweet. Higher-risk stage IIC patients have a risk of recurrence of about 40%. “It’s high, but the absolute risk reduction is about 8%. This is a very personalized discussion with the patient and the physician in understanding their risk of toxicity is about 17% and higher than their absolute risk reduction with the treatment. For me, this is a bitter pill to swallow because you’re treating people longer and you’re not sure if you’re really helping them. Until we can further define who the highest-risk patients are, I think it’s hard to give it to everyone,” said Dr. Ariyan, who is a surgeon with Memorial Sloan Kettering Cancer Center, New York.
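The trade-off Dr. Ariyan describes can be framed as number needed to treat (NNT) versus number needed to harm (NNH). The sketch below is an illustrative calculation, not an analysis from the study itself; it uses the figures quoted in this article (an approximately 8% absolute reduction in recurrence, and the 83% vs. 64% treatment-related toxicity rates reported above).

```python
# Illustrative NNT/NNH arithmetic using figures quoted in the article.
def number_needed(absolute_risk_difference: float) -> float:
    """Patients who must be treated to produce one additional event (benefit or harm)."""
    return 1.0 / absolute_risk_difference

arr = 0.08                     # ~8% absolute reduction in recurrence (article figure)
toxicity_excess = 0.83 - 0.64  # excess treatment-related toxicity vs. placebo (article figures)

nnt = number_needed(arr)              # ~12.5: treat roughly 13 patients to prevent one recurrence
nnh = number_needed(toxicity_excess)  # ~5.3: one extra toxicity event per roughly 5 patients treated
print(f"NNT ~ {nnt:.1f}, NNH ~ {nnh:.1f}")
```

On these rough numbers, more patients experience an added toxicity event than avoid a recurrence, which is the tension Dr. Ariyan's comments highlight.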
In addition to weighing short-term benefits and toxicity, there are longer-term concerns. Toxicity experienced from PD-1 inhibitors in the adjuvant setting could impact future treatment decisions. “We’re very lucky here in melanoma to know that systemic therapies are effective and we can cure people who recur. I would argue this is why we probably will never really see a difference in the survival benefit in this group because people who cross over will probably do well,” Dr. Ariyan said.
During the Q&A session, Vernon Sondak, MD, Moffitt Cancer Center, Tampa, encouraged physician colleagues to have an open mind about treatments. “Beware of dogma. We thought that adjuvant immunotherapy works much better in patients with ulcerated primary tumors. That’s a dogma in some parts of the world. Yet the T4a patients in KEYNOTE-716 dramatically outperformed the ulcerated T3b and T4b [patients]. We still don’t know what we don’t know.”
The study details
KEYNOTE-716 included 976 patients 12 years or older with newly diagnosed completely resected stage IIB or IIC melanoma with a negative sentinel lymph node. Patients were randomized to placebo or 200 mg pembrolizumab every 3 weeks, or 2 mg/kg in pediatric patients, over 17 cycles. Almost 40% of patients were age 65 or older. T3b and T4b were the most common melanoma subcategories at 41% and 35%, respectively.
The planned third interim analysis occurred after 146 distant metastases had been recorded. After a median follow-up of 27.4 months, distant metastasis-free survival favored the pembrolizumab group (HR, 0.64; P = .0029). At 24 months, the pembrolizumab group had a higher distant metastasis-free survival rate, at 88.1% versus 82.2%, and a higher recurrence-free survival rate, at 81.2% versus 72.8% (HR, 0.64; 95% confidence interval, 0.50-0.84).
At 24 months, only the T4a patients had a statistically significant reduction in distant metastases at 58% (HR, 0.42; 95% CI, 0.19-0.96), although there were numerical reductions in T3a (HR, 0.71; 95% CI, 0.41-1.22) and T4b (HR, 0.70; 95% CI, 0.44-1.33) patients. Of patients experiencing a distant metastasis, 73% of the placebo group had a first distant metastasis to the lung compared with 49% of the pembrolizumab group.
Dr. Long has held consulting or advisory roles for Merck Sharp & Dohme, which funded this study.
AT ASCO 2022