Green light puts the brakes on migraine
Daily exposure to green light reduced the number of headache days per month in patients with migraine, according to results of a small study from the University of Arizona, Tucson.
“This is the first clinical study to evaluate green light exposure as a potential preventive therapy for patients with migraine,” senior author Mohab M. Ibrahim, MD, PhD, said in a press release. “Now I have another tool in my toolbox to treat one of the most difficult neurologic conditions – migraine.”
“Given the safety, affordability, and efficacy of green light exposure, there is merit to conduct a larger study,” he and coauthors from the university wrote in their paper.
The study included 29 adult patients (average age, 52.2 years), 22 with chronic migraine and 7 with episodic migraine, recruited from the University of Arizona/Banner Medical Center chronic pain clinic. To be included, patients had to meet the International Headache Society diagnostic criteria for chronic or episodic migraine, have an average headache pain intensity of 5 out of 10 or greater on the numeric pain scale (NPS) over the 10 weeks prior to enrolling in the study, and be dissatisfied with their current migraine therapy.
The patients were free to start, continue, or discontinue any other migraine treatments as recommended by their physicians as long as this was reported to the study team.
White versus green
The one-way crossover design involved 10 weeks of exposure to white light-emitting diodes for 1-2 hours per day, followed by a 2-week washout period and then 10 weeks of exposure to green light-emitting diodes (GLED) for the same daily duration. The protocol used a light strip emitting an intensity of between 4 and 100 lux, measured at approximately 2 m and 1 m, respectively, from a lux meter.
Patients were instructed to use the light in a dark room, without falling asleep, and to participate in activities that did not require external light sources, such as listening to music, reading books, doing exercises, or engaging in similar activities. The daily minimum exposure of 1 hour, up to a maximum of 2 hours, was to be completed in one sitting.
The primary outcome measure was the number of headache days per month, defined as days with moderate to severe headache pain for at least 4 hours. Secondary outcomes included perceived reduction in duration and intensity of the headache phase of the migraine episodes assessed every 2 weeks with the NPS, improved ability to fall and stay asleep, improved ability to perform work and daily activity, improved quality of life, and reduction of pain medications.
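To make the primary outcome concrete, here is a minimal Python sketch of how headache days per month could be tallied from a daily pain diary. The diary format and the NPS cutoff of 4 for “moderate” pain are illustrative assumptions, not details taken from the study.

```python
from datetime import date

# Hypothetical daily diary entries: (date, peak NPS pain score 0-10,
# hours of moderate-to-severe pain). Not the study's actual data format.
diary = [
    (date(2020, 3, 1), 7, 6.0),
    (date(2020, 3, 2), 3, 1.0),
    (date(2020, 3, 15), 5, 4.5),
    (date(2020, 4, 2), 8, 10.0),
]

def headache_days_per_month(entries):
    """Count days meeting the primary-outcome definition: moderate to
    severe pain (assumed here as NPS >= 4) lasting at least 4 hours."""
    counts = {}
    for day, nps, hours in entries:
        month = (day.year, day.month)
        counts.setdefault(month, 0)
        if nps >= 4 and hours >= 4:
            counts[month] += 1
    return counts

print(headache_days_per_month(diary))  # {(2020, 3): 2, (2020, 4): 1}
```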
The researchers found that, when the patients with chronic migraine and episodic migraine were examined as separate groups, white light exposure did not significantly reduce the number of headache days per month; when the two groups were combined, however, there was a significant reduction, from 18.2 to 16.5 headache days per month.
Green light, on the other hand, significantly reduced headache days both in the separate groups (from 7.9 to 2.4 days in the episodic migraine group and from 22.3 to 9.4 days in the chronic migraine group) and in the combined group (from 18.4 to 7.4 days).
“While some improvement in secondary outcomes was observed with white light emitting diodes, more secondary outcomes with significantly greater magnitude including assessments of quality of life, Short-Form McGill Pain Questionnaire, Headache Impact Test-6, and Five-level version of the EuroQol five-dimensional survey without reported side effects were observed with green light emitting diodes,” the authors reported.
“The use of a nonpharmacological therapy such as green light can be of tremendous help to a variety of patients that either do not want to be on medications or do not respond to them,” coauthor Amol M. Patwardhan, MD, PhD, said in the press release. “The beauty of this approach is the lack of associated side effects. If at all, it appears to improve sleep and other quality of life measures,” said Dr. Patwardhan, associate professor and vice chair of research in the University of Arizona’s department of anesthesiology.
Better than white light
Asked to comment on the findings, Alan M. Rapoport, MD, clinical professor of neurology at the University of California, Los Angeles, said research has shown for some time that exposure to green light has beneficial effects in migraine patients. This study, although small, does indicate that green light is more beneficial than is white light and reduces headache days and intensity. “I believe patients would be willing to spend 1-2 hours a day in green light to reduce and improve their migraine with few side effects. A larger randomized trial should be done,” he said.
The study was funded by support from the National Center for Complementary and Integrative Health (to Dr. Ibrahim), the Comprehensive Chronic Pain and Addiction Center–University of Arizona, and the University of Arizona CHiLLI initiative. Dr. Ibrahim and one coauthor have a patent pending through the University of Arizona for use of green light therapy for the management of chronic pain. Dr. Rapoport is a former president of the International Headache Society. He is an editor of Headache and CNS Drugs, and Editor-in-Chief of Neurology Reviews. He reviews for many peer-reviewed journals such as Cephalalgia, Neurology, New England Journal of Medicine, and Headache.
FROM CEPHALALGIA
Simple blood test plus AI may flag early-stage Alzheimer’s disease
A simple blood test combined with artificial intelligence (AI) may flag Alzheimer’s disease in its early stages, raising the prospect of early intervention when effective treatments become available.
In the study, investigators used six AI methodologies, including Deep Learning, to assess blood leukocyte epigenomic biomarkers. They found more than 150 epigenetic differences in study participants with Alzheimer’s disease in comparison with participants who did not have the disease.
All of the AI platforms were effective in predicting Alzheimer’s disease. Deep Learning’s assessment of intragenic cytosine-phosphate-guanines (CpGs) had sensitivity and specificity rates of 97%.
“It’s almost as if the leukocytes have become a newspaper to tell us, ‘This is what is going on in the brain,’” lead author Ray Bahado-Singh, MD, chair of the department of obstetrics and gynecology, Oakland University, Auburn Hills, Mich., said in a news release.
The researchers noted that the findings, if replicated in future studies, may help in providing Alzheimer’s disease diagnoses “much earlier” in the disease process. “The holy grail is to identify patients in the preclinical stage so effective early interventions, including new medications, can be studied and ultimately used,” Dr. Bahado-Singh said.
“This certainly isn’t the final step in Alzheimer’s research, but I think this represents a significant change in direction,” he told attendees at a press briefing.
The findings were published online March 31 in PLOS ONE.
Silver tsunami
The investigators noted that Alzheimer’s disease is often diagnosed when the disease is in its later stages, after irreversible brain damage has occurred. “There is currently no cure for the disease, and the treatment is limited to drugs that attempt to treat symptoms and have little effect on the disease’s progression,” they noted.
Coinvestigator Khaled Imam, MD, director of geriatric medicine for Beaumont Health in Michigan, pointed out that although MRI and lumbar puncture can identify Alzheimer’s disease early on, the processes are expensive and/or invasive.
“Having biomarkers in the blood ... and being able to identify [Alzheimer’s disease] years before symptoms start, hopefully we’d be able to intervene early on in the process of the disease,” Dr. Imam said.
It is estimated that the number of Americans aged 85 and older will triple by 2050. This impending “silver tsunami,” which will come with a commensurate increase in Alzheimer’s disease cases, makes it even more important to be able to diagnose the disease early on, he noted.
The study included 24 individuals with late-onset Alzheimer’s disease (70.8% women; mean age, 83 years) and 24 who were deemed “cognitively healthy” (66.7% women; mean age, 80 years). About 500 ng of genomic DNA was extracted from a whole-blood sample from each participant.
The researchers used the Infinium MethylationEPIC BeadChip array, and the samples were then examined for markers of methylation that would “indicate the disease process has started,” they noted.
In addition to Deep Learning, the five other AI platforms were the Support Vector Machine, Generalized Linear Model, Prediction Analysis for Microarrays, Random Forest, and Linear Discriminant Analysis.
These platforms were used to assess leukocyte genome changes. To predict Alzheimer’s disease, the researchers also used Ingenuity Pathway Analysis.
Significant “chemical changes”
Results showed that the Alzheimer’s disease group had 152 significantly differentially methylated CpGs in 171 genes in comparison with the non-Alzheimer’s disease group (false discovery rate P value < .05).
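For readers unfamiliar with the statistic, the false discovery rate threshold accounts for the large number of CpG sites tested at once. The sketch below shows a standard Benjamini-Hochberg adjustment in Python; the p-values are invented, and the authors’ actual pipeline may differ.

```python
import numpy as np

def benjamini_hochberg(pvals):
    """Return Benjamini-Hochberg FDR-adjusted p-values."""
    p = np.asarray(pvals, dtype=float)
    n = len(p)
    order = np.argsort(p)
    scaled = p[order] * n / np.arange(1, n + 1)
    # Enforce monotonicity from the largest p-value downward, cap at 1.
    adjusted = np.minimum.accumulate(scaled[::-1])[::-1].clip(max=1.0)
    out = np.empty(n)
    out[order] = adjusted
    return out

# Toy p-values for five CpG sites (illustrative, not study data).
pvals = [0.0001, 0.008, 0.04, 0.30, 0.55]
print(benjamini_hochberg(pvals) < 0.05)
# [ True  True False False False]  -> two sites significant at FDR < .05
```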
As a whole, using intragenic and intergenic/extragenic CpGs, the AI platforms were effective in predicting who had Alzheimer’s disease (area under the curve [AUC], ≥ 0.93). Using intragenic markers, the AUC for Deep Learning was 0.99.
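As context for these figures, AUC, sensitivity, and specificity can be computed from a classifier’s predictions as in this minimal scikit-learn sketch; the labels and scores are made up and do not reproduce the study’s models or data.

```python
import numpy as np
from sklearn.metrics import confusion_matrix, roc_auc_score

# Hypothetical true labels (1 = Alzheimer's) and predicted probabilities.
y_true = np.array([1, 1, 1, 0, 0, 0, 1, 0])
y_score = np.array([0.92, 0.88, 0.40, 0.10, 0.35, 0.05, 0.77, 0.60])

auc = roc_auc_score(y_true, y_score)  # threshold-free ranking quality

# Dichotomize at 0.5 to derive sensitivity and specificity.
y_pred = (y_score >= 0.5).astype(int)
tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
sensitivity = tp / (tp + fn)  # true-positive rate
specificity = tn / (tn + fp)  # true-negative rate
print(f"AUC={auc:.2f}, sens={sensitivity:.2f}, spec={specificity:.2f}")
```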
“We looked at close to a million different sites, and we saw some chemical changes that we know are associated with alteration or change in gene function,” Dr. Bahado-Singh said.
Altered genes that were found in the Alzheimer’s disease group included CR1L, CTSV, S1PR1, and LTB4R – all of which “have been previously linked with Alzheimer’s disease and dementia,” the researchers noted. They also found the methylated genes CTSV and PRMT5, both of which have been previously associated with cardiovascular disease.
“A significant strength of our study is the novelty, i.e. the use of blood leukocytes to accurately detect Alzheimer’s disease and also for interrogating the pathogenesis of Alzheimer’s disease,” the investigators wrote.
Dr. Bahado-Singh said that the test let them identify changes in cells in the blood, “giving us a comprehensive account not only of the fact that the brain is being affected by Alzheimer’s disease but it’s telling us what kinds of processes are going on in the brain.
“Normally you don’t have access to the brain. This gives us a simple blood test to get an ongoing reading of the course of events in the brain – and potentially tell us very early on before the onset of symptoms,” he added.
Cautiously optimistic
During the question-and-answer session following his presentation at the briefing, Dr. Bahado-Singh reiterated that they are at a very early stage in the research and were not able to make clinical recommendations at this point. However, he added, “There was evidence that DNA methylation change could likely precede the onset of abnormalities in the cells that give rise to the disease.”
Coinvestigator Stewart Graham, PhD, director of Alzheimer’s research at Beaumont Health, added that although the initial study findings led to some excitement for the team, “we have to be very conservative with what we say.”
He noted that the findings need to be replicated in a more diverse population. Still, “we’re excited at the moment and looking forward to seeing what the future results hold,” Dr. Graham said.
Dr. Bahado-Singh said that if larger studies confirm the findings and the test is viable, it would make sense to use it as a screen for individuals older than 65. He noted that because of the aging of the population, “this subset of individuals will constitute a larger and larger fraction of the population globally.”
Still early days
Commenting on the findings, Heather Snyder, PhD, vice president of medical and scientific relations at the Alzheimer’s Association, noted that the investigators used an “interesting” diagnostic process.
“It was a unique approach to looking at and trying to understand what might be some of the biological underpinnings and using these tools and technologies to determine if they’re able to differentiate individuals with Alzheimer’s disease” from those without Alzheimer’s disease, said Dr. Snyder, who was not involved with the research.
“Ultimately, we want to know who is at greater risk, who may have some of the changing biology at the earliest time point so that we can intervene to stop the progression of the disease,” she said.
She pointed out that a number of types of biomarker tests are currently under investigation, many of which are measuring different outcomes. “And that’s what we want to see going forward. We want to have as many tools in our toolbox that allow us to accurately diagnose at that earliest time point,” Dr. Snyder said.
“At this point, [the current study] is still pretty early, so it needs to be replicated and then expanded to larger groups to really understand what they may be seeing,” she added.
Dr. Bahado-Singh, Dr. Imam, Dr. Graham, and Dr. Snyder have reported no relevant financial relationships.
A version of this article first appeared on Medscape.com.
FROM PLOS ONE
Encephalopathy common, often lethal in hospitalized patients with COVID-19
Toxic metabolic encephalopathy (TME) is common and often lethal in hospitalized patients with COVID-19, new research shows. Results of a retrospective study show that of almost 4,500 patients with COVID-19, 12% were diagnosed with TME. Of these, 78% developed encephalopathy immediately prior to hospital admission. Septic encephalopathy, hypoxic-ischemic encephalopathy (HIE), and uremia were the most common causes, although multiple causes were present in close to 80% of patients. TME was also associated with a 24% higher risk of in-hospital death.
“We found that close to one in eight patients who were hospitalized with COVID-19 had TME that was not attributed to the effects of sedatives, and that this is incredibly common among these patients who are critically ill,” said lead author Jennifer A. Frontera, MD, of New York University.
“The general principle of our findings is to be more aggressive in TME; and from a neurologist perspective, the way to do this is to eliminate the effects of sedation, which is a confounder,” she said.
The study was published online March 16 in Neurocritical Care.
Drilling down
“Many neurological complications of COVID-19 are sequelae of severe illness or secondary effects of multisystem organ failure, but our previous work identified TME as the most common neurological complication,” Dr. Frontera said.
Previous research investigating encephalopathy among patients with COVID-19 included patients who may have been sedated or have had a positive Confusion Assessment Method (CAM) result.
“A lot of the delirium literature is effectively heterogeneous because there are a number of patients who are on sedative medication that, if you could turn it off, these patients would return to normal. Some may have underlying neurological issues that can be addressed, but you can’t get to the bottom of this unless you turn off the sedation,” Dr. Frontera noted.
“We wanted to be specific and try to drill down to see what the underlying cause of the encephalopathy was,” she said.
The researchers retrospectively analyzed data on 4,491 patients (≥ 18 years old) with COVID-19 who were admitted to four New York City hospitals between March 1, 2020, and May 20, 2020. Of these, 559 (12%) with TME were compared with 3,932 patients without TME.
The researchers looked at index admissions and included patients who had:
- New changes in mental status or significant worsening of mental status (in patients with baseline abnormal mental status).
- Hyperglycemia or hypoglycemia with transient focal neurologic deficits that resolved with glucose correction.
- An adequate washout of sedating medications (when relevant) prior to mental status assessment.
Potential etiologies included electrolyte abnormalities, organ failure, hypertensive encephalopathy, sepsis or active infection, fever, nutritional deficiency, and environmental injury.
Foreign environment
Most (78%) of the 559 patients diagnosed with TME had already developed encephalopathy immediately prior to hospital admission, the authors report. The most common etiologies among these patients were septic encephalopathy, HIE, and uremia.
Compared with patients without TME, those with TME (all Ps < .001):
- Were older (76 vs. 62 years).
- Had higher rates of dementia (27% vs. 3%).
- Had higher rates of psychiatric history (20% vs. 10%).
- Were more often intubated (37% vs. 20%).
- Had a longer length of hospital stay (7.9 vs. 6.0 days).
- Were less often discharged home (25% vs. 66%).
“It’s no surprise that older patients and people with dementia or psychiatric illness are predisposed to becoming encephalopathic,” said Dr. Frontera. “Being in a foreign environment, such as a hospital, or being sleep-deprived in the ICU is likely to make them more confused during their hospital stay.”
Delirium as a symptom
In-hospital mortality or discharge to hospice was considerably higher in the TME versus non-TME patients (44% vs. 18%, respectively).
When the researchers adjusted for confounders (age, sex, race, worst Sequential Organ Failure Assessment score during hospitalization, ventilator status, study week, hospital location, and ICU care level) and excluded patients receiving only comfort care, they found that TME was associated with a 24% increased risk of in-hospital death (30% in patients with TME vs. 16% in those without TME).
The highest mortality risk was associated with hypoxemia, with 42% of patients with HIE dying during hospitalization, compared with 16% of patients without HIE (adjusted hazard ratio 1.56; 95% confidence interval, 1.21-2.00; P = .001).
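As a rough illustration of how such an adjusted hazard ratio is derived, the sketch below fits a Cox proportional hazards model with the Python lifelines library. The data are simulated stand-ins; the column names and effect sizes are assumptions, not the study’s variables.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 200
age = rng.normal(65, 10, n)
sofa = rng.integers(0, 15, n)      # stand-in for SOFA score
hie = rng.integers(0, 2, n)        # stand-in exposure: HIE yes/no

# Simulate survival times whose hazard rises with HIE and SOFA score.
time = rng.exponential(30, n) / np.exp(0.4 * hie + 0.05 * sofa)
death = (time < 21).astype(int)    # died before day 21
time = np.minimum(time, 21)        # censor survivors at day 21

df = pd.DataFrame({"time": time, "death": death,
                   "hie": hie, "age": age, "sofa": sofa})

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="death")
cph.print_summary()  # exp(coef) for "hie" is the adjusted hazard ratio
```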
“Not all patients who are intubated require sedation, but there’s generally a lot of hesitation in reducing or stopping sedation in some patients,” Dr. Frontera observed.
She acknowledged there are “many extremely sick patients whom you can’t ventilate without sedation.”
Nevertheless, “delirium in and of itself does not cause death. It’s a symptom, not a disease, and we have to figure out what causes it. Delirium might not need to be sedated, and it’s more important to see what the causal problem is.”
Independent predictor of death
Commenting on the study, Panayiotis N. Varelas, MD, PhD, vice president of the Neurocritical Care Society, said the study “approached the TME issue better than previously, namely allowing time for sedatives to wear off to have a better sample of patients with this syndrome.”
Dr. Varelas, who is chairman of the department of neurology and professor of neurology at Albany (N.Y.) Medical College, emphasized that TME “is not benign and, in patients with COVID-19, it is an independent predictor of in-hospital mortality.”
“One should take all possible measures … to avoid desaturation and hypotensive episodes and also aggressively treat SAE and uremic encephalopathy in hopes of improving the outcomes,” added Dr. Varelas, who was not involved with the study.
Also commenting on the study, Mitchell Elkind, MD, professor of neurology and epidemiology at Columbia University in New York, who was not associated with the research, said it “nicely distinguishes among the different causes of encephalopathy, including sepsis, hypoxia, and kidney failure … emphasizing just how sick these patients are.”
The study received no direct funding. Individual investigators were supported by grants from the National Institute on Aging and the National Institute of Neurological Disorders and Stroke. The investigators, Dr. Varelas, and Dr. Elkind have disclosed no relevant financial relationships.
A version of this article first appeared on Medscape.com.
FROM NEUROCRITICAL CARE
Poor survival with COVID in patients who have had HSCT
Rates of survival are poor for individuals who develop COVID-19 after receiving a hematopoietic stem cell transplant (HSCT), a procedure often used in the treatment of blood cancers.
The probability of survival 30 days after being diagnosed with COVID-19 is only 68% for persons who have received an allogeneic HSCT and 67% for autologous HSCT recipients, according to new data from the Center for International Blood and Marrow Transplant Research (CIBMTR).
These findings underscore the need for “stringent surveillance and aggressive treatment measures” in this population, Akshay Sharma, MBBS, of St. Jude Children’s Research Hospital, Memphis, and colleagues wrote.
The findings were published online March 1, 2021, in The Lancet Haematology.
The study is “of importance for physicians caring for HSCT recipients worldwide,” Mathieu Leclerc, MD, and Sébastien Maury, MD, Hôpital Henri Mondor, Créteil, France, commented in an accompanying editorial.
Study details
For their study, Dr. Sharma and colleagues analyzed outcomes for all HSCT recipients who developed COVID-19 and whose cases were reported to the CIBMTR. Of 318 such patients, 184 had undergone allogeneic HSCT, and 134 had undergone autologous HSCT.
Overall, about half of these patients (49%) had mild COVID-19.
Severe COVID-19 that required mechanical ventilation developed in 15% and 13% of the allogeneic and autologous HSCT recipients, respectively.
About one-fifth of patients died: 22% and 19% of allogeneic and autologous HSCT recipients, respectively.
Factors associated with greater mortality risk included age of 50 years or older (hazard ratio, 2.53), male sex (HR, 3.53), and development of COVID-19 within 12 months of undergoing HSCT (HR, 2.67).
Among autologous HSCT recipients, lymphoma was associated with higher mortality risk in comparison with a plasma cell disorder or myeloma (HR, 2.41), the authors noted.
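As a quick plausibility check on the summary percentages, absolute counts can be back-calculated from the reported group sizes. The sketch below does so; because the published percentages are rounded, the recovered counts are approximations, not the study’s exact figures.

```python
# Back-calculate approximate event counts from the reported rates.
# The published percentages are rounded, so these counts are estimates only.

groups = {
    "allogeneic HSCT": {"n": 184, "ventilated": 0.15, "died": 0.22},
    "autologous HSCT": {"n": 134, "ventilated": 0.13, "died": 0.19},
}

total_deaths = 0
for name, g in groups.items():
    vent = round(g["n"] * g["ventilated"])
    died = round(g["n"] * g["died"])
    total_deaths += died
    print(f"{name}: ~{vent} required ventilation, ~{died} of {g['n']} died")

print(f"overall: ~{total_deaths}/318 = {total_deaths / 318:.0%} (about one-fifth)")
```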
“Two important messages can be drawn from the results reported by Sharma and colleagues,” Dr. Leclerc and Dr. Maury wrote in their editorial. “The first is the confirmation that the prognosis of COVID-19 is particularly poor in HSCT recipients, and that its prevention, in the absence of any specific curative treatment with sufficient efficacy, should be at the forefront of concerns.”
The second relates to the risk factors for death among HSCT recipients who develop COVID-19. In addition to previously known risk factors, such as age and gender, the investigators identified transplant-specific factors potentially associated with prognosis – namely, the nearly threefold increase in death among allogeneic HSCT recipients who develop COVID-19 within 12 months of transplant, they explained.
However, the findings are limited by a substantial amount of missing data, short follow-up, and the possibility of selection bias, they noted.
“Further large and well-designed studies with longer follow-up are needed to confirm and refine the results,” the editorialists wrote.
“[A] better understanding of the distinctive features of COVID-19 infection in HSCT recipients will be a necessary and essential step toward improvement of the remarkably poor prognosis observed in this setting,” they added.
The study was funded by the American Society of Hematology; the Leukemia and Lymphoma Society; the National Cancer Institute; the National Heart, Lung and Blood Institute; the National Institute of Allergy and Infectious Diseases; the National Institutes of Health; the Health Resources and Services Administration; and the Office of Naval Research. Dr. Sharma receives support for the conduct of industry-sponsored trials from Vertex Pharmaceuticals, CRISPR Therapeutics, and Novartis and consulting fees from Spotlight Therapeutics. Dr. Leclerc and Dr. Maury disclosed no relevant financial relationships.
A version of this article first appeared on Medscape.com.
Omidubicel improves on umbilical cord blood transplants
Omidubicel, an investigational enriched umbilical cord blood product being developed by Gamida Cell for transplantation in patients with blood cancers, appears to have some advantages over standard umbilical cord blood.
The results come from a global phase 3 trial (NCT02730299) presented at the annual meeting of the European Society for Blood and Marrow Transplantation.
“Transplantation with omidubicel, compared to standard cord blood transplantation, results in faster hematopoietic recovery, fewer infections, and fewer days in hospital,” said coinvestigator Guillermo F. Sanz, MD, PhD, from the Hospital Universitari i Politècnic la Fe in Valencia, Spain.
“Omidubicel should be considered as the new standard of care for patients eligible for umbilical cord blood transplantation,” Dr. Sanz concluded.
Zachariah DeFilipp, MD, from Mass General Cancer Center in Boston, a hematopoietic stem cell transplantation specialist who was not involved in the study, said in an interview that “omidubicel significantly improves the engraftment after transplant, as compared to standard cord blood transplant. For patients that lack an HLA-matched donor, this approach can help overcome the prolonged cytopenias that occur with standard cord blood transplants in adults.”
Gamida Cell plans to submit these data for approval of omidubicel by the Food and Drug Administration in the fourth quarter of 2021.
Omidubicel is also being evaluated in a phase 1/2 clinical study in patients with severe aplastic anemia (NCT03173937).
Expanding possibilities
Although umbilical cord blood stem cell grafts come from a readily available source and show greater tolerance across HLA barriers than other sources (such as bone marrow), the relatively low dose of stem cells in each unit results in delayed hematopoietic recovery, increased transplant-related morbidity and mortality, and longer hospitalizations, Dr. Sanz said.
Omidubicel consists of two cryopreserved fractions from a single cord blood unit. The product contains both noncultured CD133-negative cells, including T cells, and CD133-positive cells that are then expanded ex vivo for 21 days in the presence of nicotinamide.
“Nicotinamide increases stem and progenitor cells, inhibits differentiation and increases migration, bone marrow homing, and engraftment efficiency while preserving cellular functionality and phenotype,” Dr. Sanz explained during his presentation.
In an earlier phase 1/2 trial in 36 patients with high-risk hematologic malignancies, omidubicel was associated with hematopoietic engraftment lasting at least 10 years.
Details of phase 3 trial results
The global phase 3 trial was conducted in 125 patients (aged 13-65 years) with high-risk malignancies, including acute myeloid and lymphoblastic leukemias, myelodysplastic syndrome, chronic myeloid leukemia, lymphomas, and rare leukemias. These patients were all eligible for allogeneic stem cell transplantation but did not have matched donors.
Patients were randomly assigned to receive hematopoietic reconstitution with either omidubicel (n = 52) or standard cord blood (n = 58).
At 42 days of follow-up, the median time to neutrophil engraftment in the intention-to-treat (ITT) population, the primary endpoint, was 12 days with omidubicel versus 22 days with standard cord blood (P < .001).
In the as-treated population – the 108 patients who actually received omidubicel or standard cord blood – median time to engraftment was 10.0 versus 20.5 days, respectively (P < .001).
Rates of neutrophil engraftment at 42 days were 96% with omidubicel versus 89% with standard cord blood.
The secondary endpoint of time-to-platelet engraftment in the ITT population also favored omidubicel, with a cumulative day 42 incidence rate of 55%, compared with 35% with standard cord blood (P = .028).
In the as-treated population, median times to platelet engraftment were 37 days and 50 days, respectively (P = .023). The cumulative rates of platelet engraftment at 100 days of follow-up were 83% and 73%, respectively.
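To make the median time-to-engraftment endpoint concrete, here is a toy sketch on synthetic data; the trial’s patient-level data are not public, and the values below are invented for illustration, roughly centered on the reported medians. A naive median is computed directly, whereas the trial’s actual analysis used time-to-event methods that also handle censoring, which this sketch does not.

```python
import random

# Toy illustration only: SYNTHETIC engraftment times. A real analysis would use
# time-to-event methods that account for censoring (patients who never engraft).

random.seed(0)
omidubicel = [max(1, random.gauss(12, 3)) for _ in range(52)]
standard_cb = [max(1, random.gauss(22, 5)) for _ in range(58)]

def median(values):
    s = sorted(values)
    n = len(s)
    return s[n // 2] if n % 2 else (s[n // 2 - 1] + s[n // 2]) / 2

print(f"omidubicel: median {median(omidubicel):.1f} days to neutrophil engraftment")
print(f"standard cord blood: median {median(standard_cb):.1f} days")
```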
The incidence of grade 2 or 3 bacterial or invasive fungal infections by day 100 in the ITT population was 37% among patients who received omidubicel, compared with 57% for patients who received standard cord blood (P = .027). Viral infections occurred in 10% versus 26% of patients, respectively.
The incidence of acute graft-versus-host disease at day 100 was similar between treatment groups, and there was no significant difference at 1 year.
Relapse and nonrelapse mortality rates, as well as disease-free and overall survival rates, also did not differ between groups.
In the first 100 days post transplant, patients who received omidubicel were alive and out of the hospital for a median of 60.5 days, compared with 48 days for patients who received standard cord blood (P = .005).
The study was funded by Gamida Cell. Dr. Sanz reported receiving research funding from the company and several others, and consulting fees, honoraria, speakers bureau activity, and travel expenses from other companies. Dr. DeFilipp reported no relevant financial relationships.
A version of this article first appeared on Medscape.com.
High-dose chemo no better than standard dose for B-cell lymphoma
After 10 years of follow-up, event-free survival and overall survival were similar between patients with aggressive B-cell lymphoma treated with conventional chemotherapy and those who received high-dose chemotherapy followed by autologous hematopoietic stem-cell transplantation (HSCT), according to a report published online in The Lancet Haematology.
The open-label, randomized, phase 3 trial (NCT00129090) was conducted across 61 centers in Germany on patients aged 18-60 years who had newly diagnosed, high-risk, aggressive B-cell lymphoma, according to Fabian Frontzek, MD, of the University Hospital Münster (Germany) and colleagues.
Between March 2003 and April 2009, patients were randomly assigned to eight cycles of conventional chemotherapy (cyclophosphamide, doxorubicin, vincristine, etoposide, and prednisolone) plus rituximab (R-CHOEP-14) or four cycles of high-dose chemotherapy plus rituximab followed by autologous HSCT (R-MegaCHOEP). The intention-to-treat population comprised 130 patients in the R-CHOEP-14 group and 132 patients in the R-MegaCHOEP group. The median follow-up was 9.3 years.
Similar outcomes
The 10-year event-free survival was 51% in the R-MegaCHOEP group and 57% in the R-CHOEP-14 group, a nonsignificant difference (P = .23). Similarly, the 10-year progression-free survival was 59% in the R-MegaCHOEP group and 60% in the R-CHOEP-14 group (P = .64). The 10-year overall survival was 66% in the R-MegaCHOEP group and 72% in the R-CHOEP-14 group (P = .26). Among the 190 patients who had complete remission or unconfirmed complete remission, relapse occurred in 30 (16%): 17 (17%) of 100 patients in the R-CHOEP-14 group and 13 (14%) of 90 patients in the R-MegaCHOEP group.
In terms of secondary malignancies, 22 were reported in the intention-to-treat population: 12 (9%) of 127 patients in the R-CHOEP-14 group and 10 (8%) of 126 patients in the R-MegaCHOEP group.
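The relapse and secondary-malignancy proportions quoted above can be rechecked directly from their stated numerators and denominators; the short sketch below reproduces the reported percentages.

```python
# Recheck the reported proportions from the stated numerators and denominators.
checks = [
    ("relapse, all complete responders", 30, 190),   # reported 16%
    ("relapse, R-CHOEP-14", 17, 100),                # reported 17%
    ("relapse, R-MegaCHOEP", 13, 90),                # reported 14%
    ("second malignancy, R-CHOEP-14", 12, 127),      # reported 9%
    ("second malignancy, R-MegaCHOEP", 10, 126),     # reported 8%
]
for label, num, den in checks:
    print(f"{label}: {num}/{den} = {num / den:.0%}")
```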
Patients who relapsed with aggressive histology and with CNS involvement in particular had worse outcomes and “represent a group with an unmet medical need, for which new molecular and cellular therapies should be studied,” the authors stated.
“This study shows that, in the rituximab era, high-dose therapy and autologous HSCT in first-line treatment does not improve long-term survival of younger high-risk patients with aggressive B-cell lymphoma. The R-CHOEP-14 regimen led to favorable outcomes, supporting its continued use in such patients,” the researchers concluded.
In an accompanying commentary, Gita Thanarajasingam, MD, of the Mayo Clinic, Rochester, Minn., and colleagues added that the issue of long-term outcomes is critical to evaluating these new regimens.
They applauded the inclusion of secondary malignancies in the long-term follow-up but regretted the lack of information, admittedly resource-intensive to collect, on long-term nonneoplastic adverse events. They added that “the burden of late adverse events such as cardiotoxicity, cumulative neuropathy, delayed infections, or lasting cognitive effects, among others that might drive substantial morbidity, does matter to lymphoma survivors.”
They also commented on the importance of considering effects on fertility in these patients, noting that R-MegaCHOEP patients would be unable to conceive naturally, but that the effect of R-CHOEP-14 was less clear.
“We encourage ongoing emphasis on this type of longitudinal follow-up of secondary malignancies and other nonneoplastic late toxicities in phase 3 studies as well as in the real world in hematological malignancies, so that after prioritizing cure in the front-line setting, we do not neglect the life we have helped survivors achieve for years and decades to come,” they concluded.
The study was sponsored by the German High-Grade Non-Hodgkin’s Lymphoma Study Group. The authors reported grants, personal fees, and non-financial support from multiple pharmaceutical and biotechnology companies. Dr. Thanarajasingam and her colleagues reported that they had no competing interests.
FROM THE LANCET HEMATOLOGY
Neurologic drug prices jump 50% in five years
Prices for brand-name neurologic drugs rose by roughly 50% between 2013 and 2017, new research shows. Results of the retrospective study also showed that most of the increased costs for these agents were due to rising costs for neuroimmunology drugs, mainly for those used to treat multiple sclerosis (MS).
“The same brand name medication in 2017 cost approximately 50% more than in 2013,” said Adam de Havenon, MD, assistant professor of neurology, University of Utah, Salt Lake City.
“An analogy would be if you bought an iPhone 5 in 2013 for $500, and then in 2017, you were asked to pay $750 for the exact same iPhone 5,” Dr. de Havenon added.
The study findings were published online March 10 in the journal Neurology.
$26 billion in payments
Both neurologists and patients are concerned about the high cost of prescription drugs for neurologic diseases, and Medicare Part D data indicate that these drugs are the most expensive component of neurologic care, the researchers noted. In addition, out-of-pocket costs have increased significantly for patients with neurologic diseases such as Parkinson’s disease, epilepsy, and MS.
To understand trends in payments for neurologic drugs, Dr. de Havenon and colleagues analyzed Medicare Part D claims filed from 2013 to 2017. The payments include costs paid by Medicare, the patient, government subsidies, and other third-party payers.
In addition to examining more current Medicare Part D data than previous studies, the current analysis examined all medications prescribed by neurologists that consistently remained branded or generic during the 5-year study period, said Dr. de Havenon. This approach resulted in a large number of claims and a large total cost.
To calculate the percentage change in annual payment claims, the researchers used 2013 prices as a reference point. They identified drugs named in 2013 claims and classified them as generic, brand-name only, or brand-name with generic equivalent. Researchers also divided the drugs by neurologic subspecialty.
The analysis included 520 drugs, all of which were available in each year of the study period. Of these drugs, 322 were generic, 61 were brand-name only, and 137 were brand-name with a generic equivalent. There were 90.7 million total claims.
Results showed total payments amounted to $26.65 billion. Yearly total payments increased from $4.05 billion in 2013 to $6.09 billion in 2017, representing a 50.4% increase, even after adjusting for inflation. Total claims increased by 7.6% – from 17.1 million in 2013 to 18.4 million in 2017.
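The headline figure is a plain percentage change over the study window. The sketch below reproduces it, along with the growth in claim volume, from the yearly totals quoted above.

```python
# Reproduce the headline growth figures from the reported yearly totals.

def pct_change(start: float, end: float) -> float:
    return (end - start) / start * 100

payments_2013, payments_2017 = 4.05e9, 6.09e9  # total yearly payments, USD
claims_2013, claims_2017 = 17.1e6, 18.4e6      # total yearly claims

print(f"payments: +{pct_change(payments_2013, payments_2017):.1f}%")  # ~50.4%
print(f"claims:   +{pct_change(claims_2013, claims_2017):.1f}%")      # ~7.6%
```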
From 2013 to 2017, claim payments increased by 0.6% for generic drugs, 42.4% for brand-name only drugs, and 45% for brand-name drugs with generic equivalents. The proportion of claims increased from 81.9% to 88% for generic drugs and from 4.9% to 6.2% for brand-name only drugs.
However, the proportion of claims for brand-name drugs with generic equivalents decreased from 13.3% to 5.8%.
Treatment barrier
Neuroimmunologic drugs, most of which were prescribed for MS, had exceptional cost, the researchers noted. These drugs accounted for more than 50% of payments but only 4.3% of claims. Claim payment for these drugs increased by 46.9% during the study period, from $3,337 to $4,902.
When neuroimmunologic drugs were removed from the analysis, there was still a significant increase in claim payments for brand-name only drugs (50.4%) and brand-name drugs with generic equivalents (45.6%).
Although neuroimmunologic medicines, including monoclonal antibodies, are more expensive to produce, this factor alone does not explain their exceptional cost, said Dr. de Havenon. “The high cost of brand-name drugs in this specialty is likely because the market bears it,” he added. “In other words, MS is a disabling disease and the medications work, so historically the Centers for Medicare & Medicaid Services have been willing to tolerate the high cost of these primarily brand-name medications.”
Several countries have controlled drug costs by negotiating with pharmaceutical companies and through legislation, Dr. de Havenon noted.
“My intent with this article was to raise awareness on the topic, which I struggle with frequently as a clinician. I know I want my patients to have a medication, but the cost prevents it,” he said.
‘Unfettered’ price-setting
Commenting on the findings, Robert J. Fox, MD, vice chair for research at the Neurological Institute of the Cleveland Clinic, said the study “brings into clear light” what neurologists, particularly those who treat MS, have long suspected but did not really know. These neurologists “are typically distanced from the payment aspects of the medications they prescribe,” said Dr. Fox, who was not involved with the research.
Although a particular strength of the study was its comprehensiveness, the researchers excluded infusion claims – which account for a large portion of total patient care costs for many disorders, he noted.
Drugs for MS historically have been expensive, ostensibly because of their high cost of development. In addition, the large and continued price increase that occurs long after these drugs have been approved remains unexplained, said Dr. Fox.
He noted that the study findings might not directly affect clinical practice because neurologists will continue prescribing medications they think are best for their patients. “Instead, I think this is a lesson to lawmakers about the massive error in the Medicare Modernization Act of 2003, where the federal government was prohibited from negotiating drug prices. If the seller is unfettered in setting a price, then no one should be surprised when the price rises,” Dr. Fox said.
Because many new drugs and new generic formulations for treating MS have become available during the past year, “repeating these types of economic studies for the period 2020-2025 will help us understand if generic competition – as well as new laws if they are passed – alter price,” he concluded.
The study was funded by the American Academy of Neurology, which publishes Neurology. Dr. de Havenon has received clinical research funding from AMAG Pharmaceuticals and Regeneron Pharmaceuticals. Dr. Fox receives consulting fees from many pharmaceutical companies involved in the development of therapies for MS.
A version of this article first appeared on Medscape.com.
FROM NEUROLOGY
Despite risks and warnings, CNS polypharmacy is prevalent among patients with dementia
CNS-active polypharmacy is common among community-dwelling older adults with dementia, new research suggests.
Investigators found that 14% of these individuals were receiving CNS-active polypharmacy, defined as combinations of multiple psychotropic and opioid medications taken for more than 30 days.
“For most patients, the risks of these medications, particularly in combination, are almost certainly greater than the potential benefits,” said Donovan Maust, MD, associate director of the geriatric psychiatry program, University of Michigan, Ann Arbor.
The study was published online March 9 in JAMA.
Serious risks
Memory impairment is the cardinal feature of dementia, but behavioral and psychological symptoms, which can include apathy, delusions, and agitation, are common during all stages of illness and cause significant caregiver distress, the researchers noted.
They noted that there is a dearth of high-quality evidence to support prescribing these medications in this patient population, yet “clinicians regularly prescribe psychotropic medications to community-dwelling persons with dementia in rates that far exceed use in the general older adult population.”
The Beers Criteria, from the American Geriatrics Society, advise against the practice of CNS polypharmacy because of the significant increase in risk for falls as well as impaired cognition, cardiac conduction abnormalities, respiratory suppression, and death when polypharmacy involves opioids.
They noted that previous European studies of polypharmacy in patients with dementia have not included antiepileptic medications or opioids, so the true extent of CNS-active polypharmacy may be “significantly” underestimated.
To determine the prevalence of CNS-active polypharmacy among community-dwelling older adults with dementia, the researchers analyzed prescription fill data for nearly 1.2 million Medicare patients with dementia.
The primary outcome was the prevalence of CNS-active polypharmacy in 2018. They defined CNS-active polypharmacy as exposure to three or more medications for more than 30 consecutive days from the following drug classes: antidepressants, antipsychotics, antiepileptics, benzodiazepines, nonbenzodiazepine benzodiazepine receptor agonist hypnotics (Z-drugs), and opioids.
They found that roughly one in seven (13.9%) patients met criteria for CNS-active polypharmacy. Of those receiving a CNS-active polypharmacy regimen, 57.8% had been doing so for longer than 180 days, and 6.8% had been doing so for a year. Nearly 30% of patients were exposed to five or more medications, and 5.2% were exposed to five or more medication classes.
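To make the definition concrete, here is a minimal sketch of how such exposure could be flagged from fill records. The field names, class labels, and day-by-day expansion are illustrative assumptions, not the investigators’ actual claims-processing code.

```python
# Hypothetical sketch of flagging CNS-active polypharmacy from fill
# records under the study's definition: three or more medications from
# the listed classes overlapping for more than 30 consecutive days.
# Field names and class labels are illustrative assumptions.
from collections import defaultdict
from datetime import date, timedelta

CNS_CLASSES = {"antidepressant", "antipsychotic", "antiepileptic",
               "benzodiazepine", "z_drug", "opioid"}

def polypharmacy_days(fills):
    """fills: iterable of (drug, drug_class, start_date, days_supply).
    Returns the set of calendar days on which >= 3 distinct CNS-active
    medications overlap."""
    drugs_by_day = defaultdict(set)
    for drug, drug_class, start, days_supply in fills:
        if drug_class not in CNS_CLASSES:
            continue
        for i in range(days_supply):
            drugs_by_day[start + timedelta(days=i)].add(drug)
    return {day for day, drugs in drugs_by_day.items() if len(drugs) >= 3}

def meets_definition(fills, min_consecutive=31):
    """True if >= 3 CNS-active medications overlap on more than 30
    consecutive days (i.e., a run of at least 31 days)."""
    days = sorted(polypharmacy_days(fills))
    run = 1 if days else 0
    for prev, cur in zip(days, days[1:]):
        run = run + 1 if (cur - prev).days == 1 else 1
        if run >= min_consecutive:
            return True
    return run >= min_consecutive

fills = [
    ("sertraline", "antidepressant", date(2018, 1, 1), 90),
    ("quetiapine", "antipsychotic", date(2018, 1, 10), 60),
    ("gabapentin", "antiepileptic", date(2018, 1, 15), 60),
]
print(meets_definition(fills))  # True: all three overlap Jan 15 - Mar 10 (55 days)
```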
Conservative approach warranted
Nearly all (92%) patients taking three or more CNS-active medications were taking an antidepressant, “consistent with their place as the psychotropic class most commonly prescribed both to older adults overall and those with dementia,” the investigators noted.
There is minimal high-quality evidence to support the efficacy of antidepressants for the treatment of depression for patients with dementia, they pointed out.
Nearly half (47%) of patients who were taking three or more CNS-active medications received at least one antipsychotic, most often quetiapine. Antipsychotics are not approved for people with dementia but are often prescribed off label for agitation, anxiety, and sleep problems, the researchers noted.
Nearly two thirds (62%) of patients with dementia who were taking three or more CNS drugs were taking an antiepileptic (most commonly, gabapentin); 41%, benzodiazepines; 32%, opioids; and 6%, Z-drugs.
The most common polypharmacy class combination included at least one antidepressant, one antiepileptic, and one antipsychotic. These accounted for 12.9% of polypharmacy days.
Despite limited high-quality evidence of efficacy, the prescribing of psychotropic medications and opioids is “pervasive” for adults with dementia in the United States, the investigators noted.
“Especially given that older adults with dementia might not be able to convey side effects they are experiencing, I think clinicians should be more conservative in how they are prescribing these medications and skeptical about the potential for benefit,” said Dr. Maust.
Regarding study limitations, the researchers noted that prescription medication claims may have led to an overestimation of the exposure to polypharmacy, insofar as the prescriptions may have been filled but not taken or were taken only on an as-needed basis.
In addition, the investigators were unable to determine the appropriateness of the particular combinations used or to examine the specific harms associated with CNS-active polypharmacy.
A major clinical challenge
Weighing in on the results, Howard Fillit, MD, founding executive director and chief science officer of the Alzheimer’s Drug Discovery Foundation, said the study is important because polypharmacy is one of the “geriatric giants, and the question is, what do you do about it?”
Dr. Fillit said it is important to conduct a careful medication review for all older patients, “making sure that the use of each drug is appropriate. The most important thing is to define what is the appropriate utilization of these kinds of drugs. That goes for both overutilization or misuse of these drugs and underutilization, where people are undertreated for symptoms that can’t be managed by behavioral management, for example,” Dr. Fillit said.
Dr. Fillit also said the finding that about 14% of dementia patients were receiving three or more of these drugs “may not be an outrageous number, because these patients, especially as they get into moderate and severe stages of disease, can be incredibly difficult to manage.
“Very often, dementia patients have depression, and up to 90% will have agitation and even psychosis during the course of dementia. And many of these patients need these types of drugs,” said Dr. Fillit.
Echoing the authors, Dr. Fillit said a key limitation of the study is not knowing whether the prescribing was appropriate or not.
The study was supported by a grant from the National Institute on Aging. Dr. Maust and Dr. Fillit have disclosed no relevant financial relationships.
A version of this article first appeared on Medscape.com.
FROM JAMA
Don’t delay: Cancer patients need both doses of COVID vaccine
The new findings, which are soon to be published as a preprint, cast doubt on the current U.K. policy of delaying the second dose of the vaccine.
Delaying the second dose can leave most patients with cancer wholly or partially unprotected, according to the researchers. Moreover, such a delay has implications for transmission of SARS-CoV-2 in the cancer patient’s environs as well as for the evolution of virus variants that could be of concern, the researchers concluded.
The data come from a British study that included 151 patients with cancer and 54 healthy control persons. All participants received the COVID-19 mRNA BNT162b2 vaccine (Pfizer-BioNTech).
This vaccine requires two doses. The first few participants in this study were given the second dose 21 days after they had received the first dose, but then national guidelines changed, and the remaining participants had to wait 12 weeks to receive their second dose.
The researchers reported that, among healthy controls, the immune efficacy of the first dose was very high (97% efficacious). By contrast, among patients with solid tumors, the immune efficacy of a single dose was strikingly low (39%), and it was even lower in patients with hematologic malignancies (13%).
The second dose of vaccine greatly and rapidly increased the immune efficacy in patients with solid tumors (95% within 2 weeks of receiving the second dose), the researchers added.
Too few patients with hematologic cancers had received the second dose before the study ended for clear conclusions to be drawn. Nevertheless, the available data suggest that 50% of patients with hematologic cancers who had received the booster at day 21 were seropositive at 5 weeks vs. only 8% of those who had not received the booster.
“Our data provide the first real-world evidence of immune efficacy following one dose of the Pfizer vaccine in immunocompromised patient populations [and] clearly show that the poor one-dose efficacy in cancer patients can be rescued with an early booster at day 21,” commented senior author Sheeba Irshad, MD, senior clinical lecturer, King’s College London.
“Based on our findings, we would recommend an urgent review of the vaccine strategy for clinically extremely vulnerable groups. Until then, it is important that cancer patients continue to observe all public health measures in place, such as social distancing and shielding when attending hospitals, even after vaccination,” Dr. Irshad added.
The paper, with first author Leticia Monin-Aldama, PhD, is scheduled to appear on the preprint server medRxiv. It has not undergone peer review. The paper was distributed to journalists, with comments from experts not involved in the study, by the UK Science Media Centre.
These data are “of immediate importance” to patients with cancer, commented Shoba Amarnath, PhD, Newcastle University research fellow, Laboratory of T-cell Regulation, Newcastle University Center for Cancer, Newcastle upon Tyne, England.
“These findings are consistent with our understanding. … We know that the immune system within cancer patients is compromised as compared to healthy controls,” Dr. Amarnath said. “The data in the study support the notion that, in solid cancer patients, a considerable delay in second dose will extend the period when cancer patients are at risk of SARS-CoV-2 infection.”
Although more data are required, “this study does raise the issue of whether patients with cancer, other diseases, or those undergoing therapies that affect the body’s immune response should be fast-tracked for their second vaccine dose,” commented Lawrence Young, PhD, professor of molecular oncology and director of the Warwick Cancer Research Center, University of Warwick, Coventry, England.
Stephen Evans, MSc, professor of pharmacoepidemiology, London School of Hygiene and Tropical Medicine, underlined that the study is “essentially” observational and “inevitable limitations must be taken into account.
“Nevertheless, these results do suggest that the vaccines may well not protect those patients with cancer as well as those without cancer,” Mr. Evans said. He added that it is “important that this population continues to observe all COVID-19–associated measures, such as social distancing and shielding when attending hospitals, even after vaccination.”
Study details
Previous studies have shown that some patients with cancer have prolonged responses to SARS-CoV-2 infection, with ongoing immune dysregulation, inefficient seroconversion, and prolonged viral shedding.
There are few data, however, on how these patients respond to COVID-19 vaccination. The authors point out that, among the 18,860 individuals who received the Pfizer vaccine during its development trials, “none with an active oncological diagnosis was included.”
To investigate this issue, they launched the SARS-CoV-2 for Cancer Patients (SOAP-02) study.
The 151 patients with cancer who participated in this study were mostly elderly, the authors noted (75% were older than 65 years; the median age was 73 years). The majority (63%) had solid-tumor malignancies. Of those, 8% had late-stage disease and had been living with their cancer for more than 24 months.
The healthy control persons were vaccine-eligible primary health care workers who were not age matched to the cancer patients.
All participants received the first dose of vaccine; 31 (of 151) patients with cancer and 16 (of 54) healthy control persons received the second dose on day 21.
The remaining participants were scheduled to receive their second dose 12 weeks later (after the study ended), in line with the changes in the national guidelines.
The team reported that, approximately 21 days after receiving the first vaccine dose, the immune efficacy of the vaccine was estimated to be 97% among healthy control persons vs. 39% for patients with solid tumors and only 13% for those with hematologic malignancies (P < .0001 for both).
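The article reports only percentages, not denominators, so the following sketch uses hypothetical counts reconstructed from the reported rates (about 97% of 54 controls and 39% of roughly 95 solid-tumor patients) to illustrate how such a comparison of proportions might be tested. The paper’s evaluable denominators and statistical method may differ.

```python
# Hypothetical comparison of seroconversion proportions (not the paper's
# analysis). Counts are reconstructed from the reported rates -- roughly
# 97% of 54 healthy controls vs. 39% of ~95 solid-tumor patients -- and
# the actual evaluable denominators may differ.
from scipy.stats import fisher_exact

controls = (52, 2)      # (seropositive, seronegative), ~97% of 54
solid_tumor = (37, 58)  # ~39% of 95

odds_ratio, p_value = fisher_exact([controls, solid_tumor])
print(f"odds ratio {odds_ratio:.1f}, p = {p_value:.1e}")  # p far below .0001
```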
T-cell responses, as assessed via interferon-gamma and/or interleukin-2 production, were observed in 82% of healthy control persons, 71% of patients with solid tumors, and 50% of those with hematologic cancers.
Vaccine boosting at day 21 resulted in immune efficacy of 100% for healthy control persons and 95% for patients with solid tumors. In contrast, only 43% of those who did not receive the second dose were seropositive 2 weeks later.
Further analysis suggested that participants who did not have a serologic response were “spread evenly” across different cancer types, but the reduced responses were more frequent among patients who had received the vaccine within 15 days of cancer treatment, especially chemotherapy, and had undergone intensive treatments.
The SOAP study is sponsored by King’s College London and Guy’s and St Thomas’ NHS Foundation Trust. It is funded by grants from the KCL Charity, Cancer Research UK, and program grants from Breast Cancer Now. The investigators have disclosed no relevant financial relationships.
A version of this article first appeared on Medscape.com.
New inhibitor shows promise in previously failed B-cell malignancies
Pirtobrutinib showed promising efficacy in patients with B-cell malignancies who discontinued prior Bruton’s tyrosine kinase (BTK)–inhibitor treatment due to resistance or intolerance, according to the results of the BRUIN trial, a phase 1/2 study.
Pirtobrutinib (formerly known as LOXO-305) is an oral, highly selective, reversible BTK inhibitor that might address a growing, unmet need for alternative therapies in patients for whom BTK-inhibitor treatment has failed, according to Anthony R. Mato, MD, of Memorial Sloan Kettering Cancer Center, New York, and colleagues. Their report was published in The Lancet.
The study included 109 women (34%) and 214 men (66%), with a median age of 68 years, who were treated with pirtobrutinib. Of these, 203 patients were assigned to pirtobrutinib (25-300 mg once per day) in the phase 1 portion of the study, and 120 patients were assigned to pirtobrutinib (200 mg once per day) in phase 2.
Promising outcomes
Pirtobrutinib showed promising efficacy and tolerable safety in patients with chronic lymphocytic leukemia (CLL) or small lymphocytic lymphoma (SLL), mantle cell lymphoma, and Waldenström macroglobulinemia who were previously treated with a BTK inhibitor. In 121 efficacy-evaluable patients with CLL or SLL treated with a previous covalent BTK inhibitor, the overall response rate (ORR) with pirtobrutinib was 62% (95% confidence interval, 53-71). The ORR was similar in CLL patients with previous covalent BTK inhibitor resistance (67%), covalent BTK inhibitor intolerance (52%), BTK C481-mutant (71%), and BTK wild-type (66%) disease.
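Assuming 75 responders among the 121 efficacy-evaluable patients (62%), a Wilson score interval, sketched below, lands close to the reported 53-71; the interval method actually used in the paper is not stated here.

```python
# Hypothetical check of the reported ORR and 95% CI, assuming 75
# responders among 121 efficacy-evaluable patients; the paper's interval
# method is not stated, so a Wilson score interval is used here.
from math import sqrt

def wilson_ci(successes: int, n: int, z: float = 1.96):
    """Wilson score confidence interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return center - half, center + half

lo, hi = wilson_ci(75, 121)
print(f"ORR {75/121:.0%}, 95% CI {lo:.0%}-{hi:.0%}")  # ~62%, ~53%-70%
```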
In 52 efficacy-evaluable patients with mantle cell lymphoma (MCL) previously treated with covalent BTK inhibitors, the ORR was 52% (95% CI, 38-66). Of 117 patients with CLL, SLL, or MCL who responded, all but 8 remain progression free to date, the authors stated.
In 19 efficacy-evaluable patients with Waldenström macroglobulinemia, the ORR was 68%. Among eight efficacy-evaluable patients with follicular lymphoma, responses were observed in four (50%), and six (75%) of eight efficacy-evaluable patients with Richter’s transformation identified before enrollment responded to treatment, the authors stated.
No dose-limiting toxicities were observed, and the maximum tolerated dose was not reached, according to the researchers. The recommended phase 2 dose was 200 mg daily. The most common adverse events, occurring in at least 10% of the 323 patients, were fatigue (20%), diarrhea (17%), and contusion (13%). The most common grade 3 or higher adverse event was neutropenia (10%). Five patients (1%) discontinued treatment because of a treatment-related adverse event.
In this “first-in-human trial of pirtobrutinib, we showed promising efficacy and safety in patients with B-cell malignancies, including CLL or SLL, MCL, Waldenström macroglobulinemia, and follicular lymphoma. Activity was observed in heavily pretreated patients, including patients with resistance and intolerance to previous covalent BTK inhibitor treatment. Global randomized phase 3 studies in CLL or SLL, and MCL are planned,” the researchers concluded.
Birth of a third generation?
“The pirtobrutinib study, by opening the way for a third generation of BTK inhibitors, could improve such a personalized molecular approach in the treatment of B-cell malignancies,” according to accompanying editorial comment by Jean-Marie Michot, MD, and Vincent Ribrag, MD, both of the Institut de Cancérologie Gustave Roussy, Villejuif, France.
They discussed how BTK inhibitors have been a considerable therapeutic advance in the treatment of B-cell non-Hodgkin lymphoma and CLL, and how the three currently approved BTK inhibitors, namely ibrutinib, acalabrutinib, and zanubrutinib, are all covalent, irreversible inhibitors that bind the protein at the C481 site. “Ibrutinib was the first approved drug. The second-generation inhibitors, acalabrutinib and zanubrutinib, were designed to be more BTK selective,” they added. However, the covalency and irreversibility of the drugs, considered therapeutic strengths, have led to acquired resistance mutations at the covalent binding site, rendering the drugs inactive. “Two advantages of this new drug class are highlighted. First, the selectivity of the drug on BTK appears to be increased,” they wrote. “Second, this class does not bind BTK to the C481 residue, and the efficacy of the drug is therefore not affected by mutations in the BTK binding site.”
Several of the study authors reported receiving grants and personal fees from Loxo Oncology (a wholly owned subsidiary of Eli Lilly), which sponsored the study, as well as financial relationships with other pharmaceutical and biotechnology companies.
Dr. Michot and Dr. Ribrag reported that they had no disclosures relevant to the discussion.
FROM THE LANCET