Palliative care for patients with dementia: When to refer?
Palliative care for people with dementia is increasingly recognized as a way to improve quality of life and provide relief from the myriad physical and psychological symptoms of advancing neurodegenerative disease. But unlike in cancer, referrals to specialist palliative care in dementia are not guided by well-established criteria.
A new literature review has found these referrals to be all over the map among patients with dementia – with many occurring very late in the disease process – and to reflect no consistent criteria based on patient needs.
For their research, published March 2 in the Journal of the American Geriatrics Society, Li Mo, MD, of the University of Texas MD Anderson Cancer Center in Houston, and colleagues looked at nearly 60 studies dating back to the early 1990s that contained information on referrals to palliative care for patients with dementia. While a palliative care approach can be provided by nonspecialists, all the included studies dealt at least in part with specialist care.
Standardized criteria are lacking
The investigators found advanced or late-stage dementia to be the most common reason cited for referral, with three quarters of the studies recommending palliative care for late-stage or advanced dementia, generally without qualifying what symptoms or needs were present. Patients received palliative care across a range of settings, including nursing homes, hospitals, and their own homes, though many articles did not include information on where patients received care.
A fifth of the articles suggested that medical complications of dementia, including falls, pneumonia, and ulcers, should trigger referrals to palliative care, while another fifth cited poor prognosis, defined variously as a life expectancy of between 6 months and 2 years. Poor nutrition status was identified in 10% of studies as meriting referral.
Only 20% of the studies identified patient needs – evidence of psychological distress or functional decline, for example – as criteria for referral, despite these being ubiquitous in dementia. The authors said they were surprised by this finding, which could possibly be explained, they wrote, by “the interest among geriatrician, neurologist, and primary care teams to provide good symptom management,” reflecting a de facto palliative care approach. “There is also significant stigma associated with a specialist palliative care referral,” the authors noted.
Curiously, the researchers noted, in more than a quarter of the studies a new diagnosis of dementia triggered referral – a finding that possibly reflects delayed diagnoses.
The findings revealed “heterogeneity in the literature in reasons for involving specialist palliative care, which may partly explain the variation in patterns of palliative care referral,” Dr. Mo and colleagues wrote, stressing that more standardized criteria are urgently needed to bring dementia in line with cancer in terms of providing timely palliative care.
Patients with advancing dementia become increasingly unable to self-report symptoms, meaning that more attention to patient complaints earlier in the disease course, and greater sensitivity to patient distress, are required. By routinely screening symptoms, clinicians could use specific cutoffs “as triggers to initiate automatic timely palliative care referral,” the authors concluded, noting that more research was needed before these cutoffs, whether based on symptom intensity or other measures, could be calculated.
Dr. Mo and colleagues acknowledged weaknesses of their study, including that a third of the articles in the review were based on expert consensus and that others did not distinguish clearly between primary and specialist palliative care.
A starting point for further discussion
Asked to comment on the findings, Elizabeth Sampson, MD, a palliative care researcher at University College London, praised Dr. Mo and colleagues’ study as “starting to pull together the strands” of a systematic approach to referrals and access to palliative care in dementia.
“Sometimes you need a paper like this to kick off the discussion to say look, this is where we are,” Dr. Sampson said, noting that the focus on need-based criteria dovetailed with a “general feeling in the field that we need to really think about needs, and what palliative care needs might be. What the threshold for referral should be we don’t know yet. Should it be three unmet needs? Or five? We’re still a long way from knowing.”
Dr. Sampson’s group is leading a UK-government funded research effort that aims to develop cost-effective palliative care interventions in dementia, in part through a tool that uses caregiver reports to assess symptom burden and patient needs. The research program “is founded on a needs-based approach, which aims to look at people’s individual needs and responding to them in a proactive way,” she said.
One of the obstacles to timely palliative care in dementia, Dr. Sampson said, is weighing resource allocation against what can be wildly varying prognoses. “Hospices understand when someone has terminal cancer and [is] likely to die within a few weeks, but it’s not unheard of for someone in very advanced stages of dementia to live another year,” she said. “There are concerns that a rapid increase in people with dementia being moved to palliative care could overwhelm already limited hospice capacity. We would argue that the best approach is to get palliative care out to where people with dementia live, which is usually the care home.”
Dr. Mo and colleagues’ study received funding from the National Institutes of Health, and its authors disclosed no financial conflicts of interest. Dr. Sampson’s work is supported by the UK’s Economic and Social Research Council and National Institute for Health Research. She disclosed no conflicts of interest.
FROM THE JOURNAL OF THE AMERICAN GERIATRICS SOCIETY
Risdiplam study shows promise for spinal muscular atrophy
Treatment with the oral agent risdiplam was associated with increased expression of functional survival of motor neuron (SMN) protein in infants with type 1 spinal muscular atrophy (SMA) in the FIREFISH study.
Increased SMN expression has been linked to improvements in survival and motor function, and such improvements were also observed in exploratory efficacy outcomes of the 2-part, phase 2-3, open-label study.
“No surviving infant was receiving permanent ventilation at month 12, and 7 of the 21 infants were able to sit without support, which is not expected in patients with type 1 spinal muscular atrophy, according to historical experience,” reported the FIREFISH Working Group led by Giovanni Baranello, MD, PhD, from the Dubowitz Neuromuscular Centre, National Institute for Health Research Great Ormond Street Hospital Biomedical Research Centre, Great Ormond Street Institute of Child Health University College London, and Great Ormond Street Hospital Trust, London.
However, “it cannot be stated with confidence that there was clinical benefit of the agent because the exploratory clinical endpoints were analyzed post hoc and can only be qualitatively compared with historical cohorts,” they added.
The findings were published online Feb. 24 in the New England Journal of Medicine.
A phase 2-3 open-label study
The study enrolled 21 infants with type 1 SMA, between the ages of 1 and 7 months. The majority (n = 17) were treated for 1 year with high-dose risdiplam, reaching 0.2 mg/kg of body weight per day by the twelfth month. Four infants in a low-dose cohort were treated with 0.08 mg/kg by the twelfth month. The medication was administered orally once daily in infants who were able to swallow, or by feeding tube for those who could not.
The primary outcomes of this first part of the study were safety, pharmacokinetics, pharmacodynamics (including the blood SMN protein concentration), and selection of the risdiplam dose for part 2 of the study. Exploratory outcomes included event-free survival, defined as being alive without tracheostomy or the use of permanent ventilation for 16 or more hours per day, and the ability to sit without support for at least 5 seconds.
In terms of safety, the study recorded 24 serious adverse events. “The most common serious adverse events were infections of the respiratory tract, and four infants died of respiratory complications; these findings are consistent with the neuromuscular respiratory failure that characterizes spinal muscular atrophy,” the authors reported. “The risdiplam-associated retinal toxic effects that had been previously observed in monkeys were not observed in the current study,” they added.
Regarding SMN protein levels, a median level of 2.1 times the baseline level was observed within 4 weeks after the initiation of treatment in the high-dose cohort, they reported. By 12 months, these median values had increased to 3.0 times and 1.9 times the baseline values in the low-dose and high-dose cohorts, respectively.
Looking at exploratory efficacy outcomes, 90% of infants survived without ventilatory support, and seven infants in the high-dose cohort were able to sit without support for at least 5 seconds. The higher dose of risdiplam (0.2 mg/kg per day) was selected for part 2 of the study.
The first oral treatment option
Risdiplam is the third SMA treatment approved by the Food and Drug Administration, “and has the potential to expand access to treatment for people with SMA,” commented Mary Schroth, MD, chief medical officer of Cure SMA, who was not involved in the research. She added that the exploratory outcomes of the FIREFISH study represent “a significant milestone for symptomatic infants with SMA type 1.”
While the other two approved SMA therapies – nusinersen and onasemnogene abeparvovec – have also led to improvements in survival and motor function, they are administered intrathecally and intravenously, respectively, whereas risdiplam is an oral therapy.
Dr. Schroth said there are currently no studies comparing the different SMA treatments. “Cure SMA is actively collecting real-world experience with risdiplam and other SMA treatments through multiple pathways,” she said. “Every individual and family, in collaboration with their health care provider, should discuss SMA treatments and make the decision that is best for them.”
Writing in Neuroscience Insights a few months after risdiplam’s FDA approval last summer, Ravindra N. Singh, MD, from the department of biomedical sciences, Iowa State University, Ames, wrote that, as an orally deliverable small molecule, risdiplam “is a major advancement for the treatment of SMA.”
Now, the FIREFISH study is “welcome news,” he said in an interview. “The results look promising so far,” he added. “I am cautiously optimistic that risdiplam would prove to be a viable alternative to the currently available invasive approaches. However, long-term studies (with appropriate age and sex-matched cohorts) would be needed to fully rule out the potential side effects of the repeated administrations.”
The therapy “is particularly great news for a group of SMA patients that might have tolerability and/or immune response concerns when it comes to nusinersen and gene therapy,” he noted in his article, adding that the ability to store and ship the drug at ambient temperatures, as well as its comparatively low cost, are added benefits.
The study was supported by F. Hoffmann–La Roche. Dr. Baranello disclosed that he serves as a consultant for AveXis, F. Hoffmann-La Roche, and Sarepta Therapeutics, as well as PTC Therapeutics, from whom he also receives speaker honoraria. Dr. Schroth disclosed no personal conflicts and is an employee of Cure SMA. Cure SMA works to develop strategic relationships with corporate partners with the goal of working together to lead the way to a world without SMA. In advancement of that mission, Cure SMA has received funding from multiple corporate sources including Aetna, Biogen, Blue Cross Blue Shield, Genentech, Kaiser Permanente, Novartis Gene Therapies, Scholar Rock, and United HealthCare. Cure SMA has no financial stake in any treatment and does not advocate for one treatment over another. Dr. Singh disclosed that Spinraza (Nusinersen), the first FDA-approved SMA drug, is based on the target (US patent # 7,838,657) that was discovered in his former laboratory at UMASS Medical School, Worcester, Mass.
FIREFISH study.
A boost in SMN expression has been linked to improvements in survival and motor function, which was also observed in exploratory efficacy outcomes in the 2-part, phase 2-3, open-label study.
“No surviving infant was receiving permanent ventilation at month 12, and 7 of the 21 infants were able to sit without support, which is not expected in patients with type 1 spinal muscular atrophy, according to historical experience,” reported the FIREFISH Working Group led by Giovanni Baranello, MD, PhD, from the Dubowitz Neuromuscular Centre, National Institute for Health Research Great Ormond Street Hospital Biomedical Research Centre, Great Ormond Street Institute of Child Health University College London, and Great Ormond Street Hospital Trust, London.
However, “it cannot be stated with confidence that there was clinical benefit of the agent because the exploratory clinical endpoints were analyzed post hoc and can only be qualitatively compared with historical cohorts,” they added.
The findings were published online Feb. 24 in the New England Journal of Medicine.
A phase 2-3 open-label study
The study enrolled 21 infants with type 1 SMA, between the ages of 1 and 7 months. The majority (n = 17) were treated for 1 year with high-dose risdiplam, reaching 0.2 mg/kg of body weight per day by the twelfth month. Four infants in a low-dose cohort were treated with 0.08 mg/kg by the twelfth month. The medication was administered once daily orally in infants who were able to swallow, or by feeding tube for those who could not.
The primary outcomes of this first part of the study were safety, pharmacokinetics, pharmacodynamics (including the blood SMN protein concentration), and selection of the risdiplam dose for part 2 of the study. Exploratory outcomes included event-free survival, defined as being alive without tracheostomy or the use of permanent ventilation for 16 or more hours per day, and the ability to sit without support for at least 5 seconds.
In terms of safety, the study recorded 24 serious adverse events. “The most common serious adverse events were infections of the respiratory tract, and four infants died of respiratory complications; these findings are consistent with the neuromuscular respiratory failure that characterizes spinal muscular atrophy,” the authors reported. “The risdiplam-associated retinal toxic effects that had been previously observed in monkeys were not observed in the current study,” they added.
Regarding SMN protein levels, a median level of 2.1 times the baseline level was observed within 4 weeks after the initiation of treatment in the high-dose cohort, they reported. By 12 months, these median values had increased to 3.0 times and 1.9 times the baseline values in the low-dose and high-dose cohorts, respectively.
Looking at exploratory efficacy outcomes, 90% of infants survived without ventilatory support, and seven infants in the high-dose cohort were able to sit without support for at least 5 seconds. The higher dose of risdiplam (0.2 mg/kg per day) was selected for part 2 of the study.
The first oral treatment option
Risdiplam is the third SMA treatment approved by the Food and Drug Administration, “and has the potential to expand access to treatment for people with SMA,” commented Mary Schroth, MD, chief medical officer of Cure SMA, who was not involved in the research. She added that the exploratory outcomes of the FIREFISH study represent “a significant milestone for symptomatic infants with SMA type 1.”
While the other two approved SMA therapies – nusinersen and onasemnogene abeparvovec – have led to improvements in survival and motor function, they are administered either intrathecally or intravenously respectively, while risdiplam is an oral therapy.
Dr. Schroth says there are currently no studies comparing the different SMA treatments. “Cure SMA is actively collecting real-world experience with risdiplam and other SMA treatments through multiple pathways,” she said. “Every individual and family, in collaboration with their health care provider, should discuss SMA treatments and make the decision that is best for them.”
Writing in Neuroscience Insights, a few months after risdiplam’s FDA approval last summer, Ravindra N. Singh MD, from the department of biomedical sciences, Iowa State University, Ames, wrote that, as an orally deliverable small molecule, risdiplam “is a major advancement for the treatment of SMA.”
Now, the FIREFISH study is “welcome news,” he said in an interview. “The results look promising so far,” he added. “I am cautiously optimistic that risdiplam would prove to be a viable alternative to the currently available invasive approaches. However, long-term studies (with appropriate age and sex-matched cohorts) would be needed to fully rule out the potential side effects of the repeated administrations.”
The therapy “is particularly great news for a group of SMA patients that might have tolerability and/or immune response concerns when it comes to nusinersen and gene therapy,” he noted in his article, adding that the ability to store and ship the drug at ambient temperatures, as well as its comparatively low cost are added benefits.
The study was supported by F. Hoffmann–La Roche. Dr. Baranello disclosed that he serves as a consultant for AveXis, F. Hoffmann-La Roche, and Sarepta Therapeutics, as well as PTC Therapeutics, from whom he also receives speaker honoraria. Dr. Schroth disclosed no personal conflicts and is an employee of Cure SMA. Cure SMA works to develop strategic relationships with corporate partners with the goal of working together to lead the way to a world without SMA. In advancement of that mission, Cure SMA has received funding from multiple corporate sources including Aetna, Biogen, Blue Cross Blue Shield, Genentech, Kaiser Permanente, Novartis Gene Therapies, Scholar Rock, and United HealthCare. Cure SMA has no financial stake in any treatment and does not advocate for one treatment over another. Dr. Singh disclosed that Spinraza (Nusinersen), the first FDA-approved SMA drug, is based on the target (US patent # 7,838,657) that was discovered in his former laboratory at UMASS Medical School, Worcester, Mass.
FIREFISH study.
A boost in SMN expression has been linked to improvements in survival and motor function, which was also observed in exploratory efficacy outcomes in the 2-part, phase 2-3, open-label study.
“No surviving infant was receiving permanent ventilation at month 12, and 7 of the 21 infants were able to sit without support, which is not expected in patients with type 1 spinal muscular atrophy, according to historical experience,” reported the FIREFISH Working Group led by Giovanni Baranello, MD, PhD, from the Dubowitz Neuromuscular Centre, National Institute for Health Research Great Ormond Street Hospital Biomedical Research Centre, Great Ormond Street Institute of Child Health University College London, and Great Ormond Street Hospital Trust, London.
However, “it cannot be stated with confidence that there was clinical benefit of the agent because the exploratory clinical endpoints were analyzed post hoc and can only be qualitatively compared with historical cohorts,” they added.
The findings were published online Feb. 24 in the New England Journal of Medicine.
A phase 2-3 open-label study
The study enrolled 21 infants with type 1 SMA, between the ages of 1 and 7 months. The majority (n = 17) were treated for 1 year with high-dose risdiplam, reaching 0.2 mg/kg of body weight per day by the twelfth month. Four infants in a low-dose cohort were treated with 0.08 mg/kg by the twelfth month. The medication was administered once daily orally in infants who were able to swallow, or by feeding tube for those who could not.
The primary outcomes of this first part of the study were safety, pharmacokinetics, pharmacodynamics (including the blood SMN protein concentration), and selection of the risdiplam dose for part 2 of the study. Exploratory outcomes included event-free survival, defined as being alive without tracheostomy or the use of permanent ventilation for 16 or more hours per day, and the ability to sit without support for at least 5 seconds.
In terms of safety, the study recorded 24 serious adverse events. “The most common serious adverse events were infections of the respiratory tract, and four infants died of respiratory complications; these findings are consistent with the neuromuscular respiratory failure that characterizes spinal muscular atrophy,” the authors reported. “The risdiplam-associated retinal toxic effects that had been previously observed in monkeys were not observed in the current study,” they added.
Regarding SMN protein levels, a median level of 2.1 times the baseline level was observed within 4 weeks after the initiation of treatment in the high-dose cohort, they reported. By 12 months, these median values had increased to 1.9 times and 3.0 times the baseline values in the low-dose and high-dose cohorts, respectively.
Looking at exploratory efficacy outcomes, 90% of infants survived without ventilatory support, and seven infants in the high-dose cohort were able to sit without support for at least 5 seconds. The higher dose of risdiplam (0.2 mg/kg per day) was selected for part 2 of the study.
The first oral treatment option
Risdiplam is the third SMA treatment approved by the Food and Drug Administration, “and has the potential to expand access to treatment for people with SMA,” commented Mary Schroth, MD, chief medical officer of Cure SMA, who was not involved in the research. She added that the exploratory outcomes of the FIREFISH study represent “a significant milestone for symptomatic infants with SMA type 1.”
While the other two approved SMA therapies – nusinersen and onasemnogene abeparvovec – have also led to improvements in survival and motor function, they are administered intrathecally and intravenously, respectively; risdiplam is an oral therapy.
Dr. Schroth said there are currently no studies comparing the different SMA treatments. “Cure SMA is actively collecting real-world experience with risdiplam and other SMA treatments through multiple pathways,” she said. “Every individual and family, in collaboration with their health care provider, should discuss SMA treatments and make the decision that is best for them.”
Writing in Neuroscience Insights a few months after risdiplam’s FDA approval last summer, Ravindra N. Singh, MD, of the department of biomedical sciences, Iowa State University, Ames, wrote that, as an orally deliverable small molecule, risdiplam “is a major advancement for the treatment of SMA.”
Now, the FIREFISH study is “welcome news,” he said in an interview. “The results look promising so far,” he added. “I am cautiously optimistic that risdiplam would prove to be a viable alternative to the currently available invasive approaches. However, long-term studies (with appropriate age and sex-matched cohorts) would be needed to fully rule out the potential side effects of the repeated administrations.”
The therapy “is particularly great news for a group of SMA patients that might have tolerability and/or immune response concerns when it comes to nusinersen and gene therapy,” he noted in his article, adding that the ability to store and ship the drug at ambient temperatures, as well as its comparatively low cost are added benefits.
The study was supported by F. Hoffmann–La Roche. Dr. Baranello disclosed that he serves as a consultant for AveXis, F. Hoffmann-La Roche, and Sarepta Therapeutics, as well as PTC Therapeutics, from whom he also receives speaker honoraria. Dr. Schroth disclosed no personal conflicts and is an employee of Cure SMA. Cure SMA works to develop strategic relationships with corporate partners with the goal of working together to lead the way to a world without SMA. In advancement of that mission, Cure SMA has received funding from multiple corporate sources including Aetna, Biogen, Blue Cross Blue Shield, Genentech, Kaiser Permanente, Novartis Gene Therapies, Scholar Rock, and United HealthCare. Cure SMA has no financial stake in any treatment and does not advocate for one treatment over another. Dr. Singh disclosed that Spinraza (Nusinersen), the first FDA-approved SMA drug, is based on the target (US patent # 7,838,657) that was discovered in his former laboratory at UMASS Medical School, Worcester, Mass.
FROM THE NEW ENGLAND JOURNAL OF MEDICINE
Adherence and discontinuation limit triptan outcomes
“Few people continue on triptans either due to lack of efficacy or too many adverse events,” said Alan Rapoport, MD, clinical professor of neurology at the University of California, Los Angeles. “Some people overuse triptans when they are available and work well, but the patients are not properly informed, and do not listen.”
Migraine is among the most common neurologic disorders and ranks second among diseases contributing to years lived with disability. An estimated 11.7% of people have migraine episodes annually, and the disorder remains highly prevalent throughout patients’ lives.
Triptans were recognized as highly effective for acute migraine management when they were first introduced in the early 1990s, and they remain the first-line treatment for attacks not adequately controlled by ordinary analgesics and NSAIDs. As a drug class, the side-effect profile of triptans varies, but frequent users run the risk of medication overuse headache, a condition marked by migraines of increasing frequency and intensity.
25 years of triptan use
Study investigators conducted a nationwide, register-based cohort study using data collected from 7,435,758 Danish residents who accessed the public health care system between Jan. 1, 1994, and Oct. 31, 2019. The time frame accounts for a period of 139.0 million person-years when the residents were both alive and living in Denmark. Their findings were published online Feb. 14, 2021, in Cephalalgia.
Researchers evaluated and summarized purchases of all triptans in all dosage forms sold in Denmark during that time frame: sumatriptan, naratriptan, zolmitriptan, rizatriptan, almotriptan, eletriptan, and frovatriptan. Based on their findings, 381,695 patients purchased a triptan at least once. Triptan users were more likely to be female (75.7%) than male (24.3%).
Dr. Rapoport, who was not involved in the study, believes the sex difference in triptan use extrapolates to the U.S. migraine population as well. “Three times more women have migraines than men and buy triptans in that ratio,” he said.
Any patient who purchased at least one triptan at any point during the study was classified as a triptan user. Triptan overuse was defined, per the International Classification of Headache Disorders, as triptan use on 10 or more days per month for 3 consecutive months. It is important to note that triptans are prescribed for only two indications – migraine and cluster headache – and cluster headache is extremely rare.
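The ICHD-style overuse criterion above (use on 10 or more days per month for 3 consecutive months) reduces to a simple scan over monthly tallies. A minimal sketch; the function name and the days-per-month data shape are illustrative, not from the study:

```python
def is_triptan_overuse(monthly_use_days: list[int]) -> bool:
    """Check the ICHD-style overuse criterion: triptan use on >= 10 days
    per month for >= 3 consecutive months (illustrative helper)."""
    run = 0  # length of the current streak of qualifying months
    for days in monthly_use_days:
        run = run + 1 if days >= 10 else 0
        if run >= 3:
            return True
    return False

# Three consecutive qualifying months -> overuse:
print(is_triptan_overuse([12, 11, 10, 2]))  # True
# Qualifying months that are not consecutive -> not overuse:
print(is_triptan_overuse([12, 4, 11, 10]))  # False
```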
The study’s investigators summarized data collected throughout Denmark for more than a quarter of a century. The findings show an increase in triptan use from 345 to 945 defined daily doses per 1,000 residents per year, along with an increase in the prevalence of triptan use from 5.17 to 14.57 per 1,000 inhabitants. In addition, 12.3% of Danish residents with migraine bought a triptan between 2014 and 2019 – a figure Dr. Rapoport noted is in line with trends in other Western countries, which range between 12% and 13%.
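The utilization metric above – defined daily doses (DDD) per 1,000 residents per year – is a straightforward rate. A hedged sketch of the arithmetic, with illustrative names and inputs that are not taken from the study’s actual counts:

```python
def ddd_per_1000_per_year(total_annual_ddd: float, population: int) -> float:
    """Defined daily doses dispensed per 1,000 inhabitants per year
    (illustrative helper; inputs are hypothetical)."""
    if population <= 0:
        raise ValueError("population must be positive")
    return round(total_annual_ddd / population * 1000, 2)

# A hypothetical 945,000 DDD dispensed in a year to 1,000,000 residents
# reproduces the scale of the study's endpoint figure:
print(ddd_per_1000_per_year(945_000, 1_000_000))  # 945.0
```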
Nearly half of first-time triptan buyers (43%) did not purchase another triptan within 5 years. In conflict with established guidelines, 90% of patients who discontinued triptan treatment had tried only one triptan type.
One factor easing data collection is that Danish residents have free health care, coupled with sizable medication reimbursements. The country’s accessible health care system minimizes barriers of price and availability, yielding data that more accurately reflect patients’ experience of treatment need and satisfaction.
“In a cohort with access to free clinical consultations and low medication costs, we observed low rates of triptan adherence, likely due to disappointing efficacy and/or unpleasant side effects rather than economic considerations. Triptan success continues to be hindered by poor implementation of clinical guidelines and high rates of treatment discontinuance,” the researchers concluded.
“The most surprising thing about this study is it is exactly what I would have expected if triptans in the U.S. were free,” Dr. Rapoport said.
Dr. Rapoport is the editor in chief of Neurology Reviews and serves as a consultant to several pharmaceutical companies.
FROM CEPHALALGIA
Acute care of migraine and cluster headaches: Mainstay treatments and emerging strategies
Acute migraine headache attacks
A recent review in the Journal of Neuro-Ophthalmology by Konstantinos Spingos and colleagues (including me as the senior author) details typical and new treatments for migraine. We all know the longstanding options: the 7 triptans and the ergots; over-the-counter analgesics, which can be combined with caffeine; nonsteroidal anti-inflammatory drugs; and the 2 categories of medication that I no longer use for migraine, butalbital-containing medications and opioids.
Now 2 gepants are available—small molecule calcitonin gene-related peptide (CGRP) receptor antagonists. These medications are thought to be a useful alternative for those in whom triptans do not work or are relatively contraindicated due to coronary and cerebrovascular problems and other cardiac risk factors like obesity, smoking, lack of exercise, high cholesterol, and diabetes. Ubrogepant was approved by the FDA in 2019, and rimegepant soon followed in 2020.
- Ubrogepant: In the ACHIEVE trials, approximately 1 in 5 participants who received the 50 mg dose were pain-free at 2 hours, and nearly 40% reported that their worst migraine symptom had resolved at 2 hours. Pain relief at 2 hours was 59%.
- Rimegepant: Like ubrogepant, about 20% of trial participants who received the 75 mg melt tablet dose of rimegepant were pain-free at 2 hours. Thirty-seven percent reported that their worst migraine symptom was gone at 2 hours. Patients began to return to normal functioning in 15 minutes.
In addition to gepants, 1 ditan, which stimulates 5-HT1F receptors, has been approved. Lasmiditan is the first medication in this class to be FDA approved. It, too, is considered an alternative for patients in whom triptans are ineffective or who should not take a vasoconstrictor. In the most recent phase 3 study, the percentages of individuals who received lasmiditan and were pain-free at 2 hours were 28% (50 mg), 31% (100 mg), and 39% (200 mg). Relief from the most bothersome symptom at 2 hours occurred in 41%, 44%, and 49% of patients, respectively. Lasmiditan is a Schedule V controlled substance; dizziness occurred in 18% of patients in clinical trials. After administration, patients should not drive for 8 hours, and the drug should be used only once in a 24-hour period.
Non-pharmaceutical treatment options for acute migraine include nerve stimulation with electrical and magnetic devices, and behavioral approaches such as biofeedback training and mindfulness. The Nerivio device, worn on the upper arm and controlled by a smartphone app, appears to work as well as a triptan in some patients with almost no adverse events. Just approved in February is the Relivion device, which is worn like a tiara on the head and stimulates the frontal branches of the trigeminal nerve as well as the 2 occipital nerves at the back of the head.
Acute care of cluster headache attacks
In 2011, Ashkenazi and Schwedt published a comprehensive table in Headache outlining the treatment options for acute cluster headache. More recently, a review in CNS Drugs by Brandt and colleagues presented the choices with level 1 evidence for efficacy. They include:
- Sumatriptan, 6 mg subcutaneous injection, or 20 mg nasal spray
- Zolmitriptan, 5 or 10 mg nasal spray
- Oxygen, 100%, 7 to 12 liters per minute via a mask over the nose and mouth
The authors recommend subcutaneous sumatriptan 6 mg and/or high-flow oxygen at 9 to 12 liters per minute for 15 minutes. Subcutaneous sumatriptan, they note, achieves pain relief within 15 minutes in 75% of patients who receive it, and one-third report pain freedom. Oxygen’s efficacy has long been established, and relief comes with no adverse events. As for mask type, no significant differences have been observed in studies, though patients appear to prefer the demand valve oxygen mask, which allows a high flow rate that depends on the user’s breathing rate.
Lidocaine intranasally has been found to be effective when triptans or oxygen do not work, according to a review in The Lancet Neurology by Hoffman and May. The medication is dripped or sprayed into the ipsilateral nostril at a concentration of between 4% and 10%. Pain relief is typically achieved within 10 minutes. This review also reports efficacy with percutaneous vagus nerve stimulation with the gammaCore device and neurostimulation of the sphenopalatine ganglion, though the mechanisms of these approaches are poorly understood.
Evolving therapies for acute cluster headache include the aforementioned CGRP receptor-antagonists. Additionally, intranasal ketamine hydrochloride is under investigation in an open-label, proof-of-concept study; and a zolmitriptan patch is being evaluated in a double-blind, placebo-controlled trial.
Attacks of migraine, which are painful and debilitating, occur in 12% of the adult population, 3 times more often in women than in men. Cluster attacks are even more painful and occur in about 0.1% of the population, somewhat more often in men. Both types of headache have a variety of effective treatments, as detailed above.
Acute migraine headache attacks
A recent review in the Journal of Neuro-Ophthalmology by Konstantinos Spingos and colleagues (including me as the senior author) details typical and new treatments for migraine. We all know about the longstanding options, including the 7 triptans and ergots, as well as over-the-counter analgesics, which can be combined with caffeine, nonsteroidal anti-inflammatory drugs; and many use the 2 categories of medication that I no longer use for migraine, butalbital-containing medications, and opioids.
Now 2 gepants are available—small molecule calcitonin gene-related peptide (CGRP) receptor antagonists. These medications are thought to be a useful alternative for those in whom triptans do not work or are relatively contraindicated due to coronary and cerebrovascular problems and other cardiac risk factors like obesity, smoking, lack of exercise, high cholesterol, and diabetes. Ubrogepant was approved by the FDA in 2019, and rimegepant soon followed in 2020.
- Ubrogepant: In the ACHIEVE trials, approximately 1 in 5 participants who received the 50 mg dose were pain-free at 2 hours. Moreover, nearly 40% of individuals who received it said their worst migraine symptom was resolved at 2 hours. Pain relief at 2 hours was 59%
- Rimegepant: Like ubrogepant, about 20% of trial participants who received the 75 mg melt tablet dose of rimegepant were pain-free at 2 hours. Thirty-seven percent reported that their worst migraine symptom was gone at 2 hours. Patients began to return to normal functioning in 15 minutes.
In addition to gepants, there is 1 ditan approved, which stimulates 5-HT1F receptors. Lasmiditan is the first medication in this class to be FDA-approved. It, too, is considered an alternative in patients in whom triptans are ineffective or when patients should not take a vasoconstrictor. In the most recent phase 3 study, the percentage of individuals who received lasmiditan and were pain-free at 2 hours were 28% (50 mg), 31% (100 mg) and 39% (200 mg). Relief from the migraine sufferers’ most bothersome symptom at 2 hours occurred in 41%, 44%, and 49% of patients, respectively. Lasmiditan is a Class V controlled substance. It has 18% dizziness in clinical trials. After administration, patients should not drive for 8 hours, and it should only be used once in a 24-hour period.
Non-pharmaceutical treatment options for acute migraine include nerve stimulation using electrical and magnetic stimulation devices, and behavioral approaches such as biofeedback training and mindfulness. The Nerivio device for the upper arm is controlled by a smart phone app and seems to work as well as a triptan in some patients with almost no adverse events. Just approved in February is the Relivion device which is worn like a tiara on the head and stimulates the frontal branches of the trigeminal nerve as well as the 2 occipital nerves in the back of the head.
Acute care of cluster headache attacks
In 2011, Ashkenazi and Schwedt published a comprehensive table in Headache outlining the treatment options for acute cluster headache. More recently, a review in CNS Drugs by Brandt and colleagues presented the choices with level 1 evidence for efficacy. They include:
- Sumatriptan, 6 mg subcutaneous injection, or 20 mg nasal spray
- Zolmitriptan, 5 or 10 mg nasal spray
- Oxygen, 100%, 7 to 12 liters per minute via a mask over the nose and mouth
The authors recommend subcutaneous sumatriptan 6 mg and/or high-flow oxygen at 9- to 12- liters per minute for 15 minutes. Subcutaneous sumatriptan, they note, has been shown to achieve pain relief within 15 minutes in 75% of patients who receive it. Moreover, one-third report pain freedom. Oxygen’s efficacy has long been established, and relief comes with no adverse events. As for mask type, though no significant differences have been observed in studies, patients appear to express a preference for the demand valve oxygen type, which allows a high flow rate and is dependent on the user’s breathing rate.
Lidocaine intranasally has been found to be effective when triptans or oxygen do not work, according to a review in The Lancet Neurology by Hoffman and May. The medication is dripped or sprayed into the ipsilateral nostril at a concentration of between 4% and 10%. Pain relief is typically achieved within 10 minutes. This review also reports efficacy with percutaneous vagus nerve stimulation with the gammaCore device and neurostimulation of the sphenopalatine ganglion, though the mechanisms of these approaches are poorly understood.
Evolving therapies for acute cluster headache include the aforementioned CGRP receptor-antagonists. Additionally, intranasal ketamine hydrochloride is under investigation in an open-label, proof-of-concept study; and a zolmitriptan patch is being evaluated in a double-blind, placebo-controlled trial.
Attacks of migraine occur in 12% of the adult population, 3 times more in women than men and are painful and debilitating. Cluster attacks are even more painful and occur in about 0.1% of the population, somewhat more in men. Both types of headache have a variety of effective treatment as detailed above.
Acute migraine headache attacks
A recent review in the Journal of Neuro-Ophthalmology by Konstantinos Spingos and colleagues (including me as the senior author) details typical and new treatments for migraine. We all know about the longstanding options, including the 7 triptans and ergots, as well as over-the-counter analgesics, which can be combined with caffeine, nonsteroidal anti-inflammatory drugs; and many use the 2 categories of medication that I no longer use for migraine, butalbital-containing medications, and opioids.
Now 2 gepants are available—small molecule calcitonin gene-related peptide (CGRP) receptor antagonists. These medications are thought to be a useful alternative for those in whom triptans do not work or are relatively contraindicated due to coronary and cerebrovascular problems and other cardiac risk factors like obesity, smoking, lack of exercise, high cholesterol, and diabetes. Ubrogepant was approved by the FDA in 2019, and rimegepant soon followed in 2020.
- Ubrogepant: In the ACHIEVE trials, approximately 1 in 5 participants who received the 50 mg dose were pain-free at 2 hours. Moreover, nearly 40% of individuals who received it said their worst migraine symptom was resolved at 2 hours. Pain relief at 2 hours was 59%
- Rimegepant: Like ubrogepant, about 20% of trial participants who received the 75 mg melt tablet dose of rimegepant were pain-free at 2 hours. Thirty-seven percent reported that their worst migraine symptom was gone at 2 hours. Patients began to return to normal functioning in 15 minutes.
In addition to gepants, there is 1 ditan approved, which stimulates 5-HT1F receptors. Lasmiditan is the first medication in this class to be FDA-approved. It, too, is considered an alternative in patients in whom triptans are ineffective or when patients should not take a vasoconstrictor. In the most recent phase 3 study, the percentage of individuals who received lasmiditan and were pain-free at 2 hours were 28% (50 mg), 31% (100 mg) and 39% (200 mg). Relief from the migraine sufferers’ most bothersome symptom at 2 hours occurred in 41%, 44%, and 49% of patients, respectively. Lasmiditan is a Class V controlled substance. It has 18% dizziness in clinical trials. After administration, patients should not drive for 8 hours, and it should only be used once in a 24-hour period.
Non-pharmaceutical treatment options for acute migraine include nerve stimulation using electrical and magnetic stimulation devices, and behavioral approaches such as biofeedback training and mindfulness. The Nerivio device for the upper arm is controlled by a smart phone app and seems to work as well as a triptan in some patients with almost no adverse events. Just approved in February is the Relivion device which is worn like a tiara on the head and stimulates the frontal branches of the trigeminal nerve as well as the 2 occipital nerves in the back of the head.
Acute care of cluster headache attacks
In 2011, Ashkenazi and Schwedt published a comprehensive table in Headache outlining the treatment options for acute cluster headache. More recently, a review in CNS Drugs by Brandt and colleagues presented the choices with level 1 evidence for efficacy. They include:
- Sumatriptan, 6 mg subcutaneous injection, or 20 mg nasal spray
- Zolmitriptan, 5 or 10 mg nasal spray
- Oxygen, 100%, 7 to 12 liters per minute via a mask over the nose and mouth
The authors recommend subcutaneous sumatriptan 6 mg and/or high-flow oxygen at 9- to 12- liters per minute for 15 minutes. Subcutaneous sumatriptan, they note, has been shown to achieve pain relief within 15 minutes in 75% of patients who receive it. Moreover, one-third report pain freedom. Oxygen’s efficacy has long been established, and relief comes with no adverse events. As for mask type, though no significant differences have been observed in studies, patients appear to express a preference for the demand valve oxygen type, which allows a high flow rate and is dependent on the user’s breathing rate.
Lidocaine intranasally has been found to be effective when triptans or oxygen do not work, according to a review in The Lancet Neurology by Hoffman and May. The medication is dripped or sprayed into the ipsilateral nostril at a concentration of between 4% and 10%. Pain relief is typically achieved within 10 minutes. This review also reports efficacy with percutaneous vagus nerve stimulation with the gammaCore device and neurostimulation of the sphenopalatine ganglion, though the mechanisms of these approaches are poorly understood.
Evolving therapies for acute cluster headache include the aforementioned CGRP receptor antagonists. Additionally, intranasal ketamine hydrochloride is under investigation in an open-label, proof-of-concept study, and a zolmitriptan patch is being evaluated in a double-blind, placebo-controlled trial.
Migraine attacks are painful and debilitating; they occur in 12% of the adult population, three times more often in women than in men. Cluster attacks are even more painful and occur in about 0.1% of the population, somewhat more often in men. Both types of headache have a variety of effective treatments, as detailed above.
Core feature of frontotemporal dementia may aid diagnosis
White matter hyperintensities (WMH) may be a core feature of frontotemporal dementia (FTD), according to findings that may help physicians make this difficult diagnosis in a disease that affects adults in their prime.
“The assessment of WMH can aid differential diagnosis of bvFTD [behavioral-variant FTD] against other neurodegenerative conditions in the absence of vascular risk factors, especially when considering their spatial distribution,” said senior author Ramón Landin-Romero, PhD, Appenzeller Neuroscience Fellow, Frontotemporal Dementia Research Group, University of Sydney.
“Clinicians can ask for specific sequences in routine MRI scans to visually detect WMH,” said Dr. Landin-Romero, who is also a senior lecturer in the School of Psychology and Brain and Mind Center.
The study was published online Feb. 17 in Neurology.
Difficult diagnosis
“FTD is a collection of unrecognized young-onset (before age 65) dementia syndromes that affect people in their prime,” said Dr. Landin-Romero. He added that heterogeneity in progression trajectories and symptoms, which can include changes in behavior and personality, language impairments, and psychosis, makes it a difficult disease to diagnose.
“As such, our research was motivated by the need of sensitive and specific biomarkers of FTD, which are urgently needed to aid diagnosis, prognosis, and treatment development,” he said.
Previous research has been limited; there have only been a “handful” of cohort and case studies and studies involving individuals with mutations in one FTD-causative gene.
FTD is genetically and pathologically complex, and there has been no clear correlation between genetic mutations/underlying pathology and clinical presentation, Dr. Landin-Romero said.
WMH are common in older individuals and are linked to increased risk for cognitive impairment and dementia. Traditionally, they have been associated with vascular risk factors, such as smoking and diabetes. “But the presentation of WMH in FTD and its associations with the severity of symptoms and brain atrophy across FTD symptoms remains to be established,” said Dr. Landin-Romero.
Higher disease severity
To explore the possible association, the researchers studied 129 patients with either bvFTD (n = 64; mean age, 64 years) or Alzheimer’s disease (n = 65; mean age, 64.66 years).
Neuropsychological assessments, medical and neurologic examinations, clinical interview, and structural brain MRI were conducted for all patients, who were compared with 66 age-, sex-, and education-matched healthy control persons (mean age, 64.69 years).
Some participants in the FTD, Alzheimer’s disease, and healthy control groups (n = 54, 44, and 26, respectively) also underwent genetic screening. Postmortem pathology findings were available for a small number of FTD and Alzheimer’s disease participants (n = 13 and 5, respectively).
The medical history included lifestyle and cardiovascular risk factors, as well as other health and neurologic conditions and medication history. Hypertension, hypercholesterolemia, diabetes, and smoking were used to assess vascular risk.
The FTD and Alzheimer’s disease groups did not differ with regard to disease duration (3.55 years; standard deviation, 1.75, and 3.24 years; SD, 1.59, respectively). However, disease severity was significantly higher among those with FTD than among those with Alzheimer’s disease, as measured by the FTD Rating Scale Rasch score (–0.52; SD, 1.28, vs. 0.78; SD, 1.55; P < .001).
Compared with healthy controls, patients in the FTD and Alzheimer’s disease groups scored significantly lower on the Addenbrooke’s Cognitive Examination–Revised (ACE-R) or ACE-III scale. Patients with Alzheimer’s disease showed “disproportionately larger deficits” in memory and visuospatial processing, compared with those with FTD, whereas those with FTD performed significantly worse than those with Alzheimer’s disease in the fluency subdomain.
A larger number of patients in the FTD group screened positive for genetic abnormalities than in the Alzheimer’s disease group; no participants in the healthy control group had genetic mutations.
Unexpected findings
Mean WMH volume was significantly higher in participants with FTD than in participants with Alzheimer’s disease and in healthy controls (mean, 0.76 mL, 0.40 mL, and 0.12 mL, respectively). These larger volumes contributed to greater disease severity and cortical atrophy. Moreover, disease severity was “found to be a strong predictor of WMH volume in FTD,” the authors stated. Among patients with FTD, WMH volumes did not differ significantly by genetic mutation status or presence of a strong family history.
After controlling for age, vascular risk did not significantly predict WMH volume in the FTD group (P = .16); however, that did not hold true in the Alzheimer’s disease group.
Increased WMH were associated with anterior brain regions in FTD and with posterior brain regions in Alzheimer’s disease. In both disorders, higher WMH volume in the corpus callosum was associated with poorer cognitive performance in the domain of attention.
“The spatial distribution of WMH mirrored patterns of brain atrophy in FTD and Alzheimer’s disease, was partially independent of cortical degeneration, and was correlated with cognitive deficits,” said Dr. Landin-Romero.
The findings were not what he and his research colleagues expected. “We were expecting that the amounts of WMH would be similar in FTD and Alzheimer’s disease, but we actually found higher levels in participants with FTD,” he said. Additionally, he anticipated that patients with either FTD or Alzheimer’s disease who had more severe disease would have more WMH, but that finding only held true for people with FTD.
“In sum, our findings show that WMH are a core feature of FTD and Alzheimer’s disease that can contribute to cognitive problems, and not simply as a marker of vascular disease,” said Dr. Landin-Romero.
Major research contribution
Commenting on the study, Jordi Matias-Guiu, PhD, MD, of the department of neurology, Hospital Clinico San Carlos, Spain, who was not involved with the research, called it a “great contribution to the field.” He said that WMH “do not necessarily mean vascular pathology, and atrophy may partially explain these abnormalities and should be taken into account in the interpretation of brain MRI.
“WMH are present in both Alzheimer’s disease and FTD and are relevant to cognitive deficits found in these disorders,” he added.
The study was funded by grants from the National Health and Medical Research Council of Australia, the Dementia Research Team, and the ARC Center of Excellence in Cognition and Its Disorders. Dr. Landin-Romero is supported by the Appenzeller Neuroscience Fellowship in Alzheimer’s Disease and the ARC Center of Excellence in Cognition and Its Disorders Memory Program. The other authors’ disclosures are listed on the original article. Dr. Matias-Guiu reports no relevant financial relationships.
A version of this article first appeared on Medscape.com.
EEG data may aid diagnosis, treatment of focal epilepsy
Seizure timing in focal epilepsy is not random, new research suggests. Findings from a large longitudinal study show that seizure onset in patients with focal epilepsy follows circadian, multiday, and annual cycles.
“Although daily and multiday rhythms have previously been identified, the extent to which these nonrandom rhythms exist in a larger cohort has been unclear,” said study investigator Joline Marie Fan, MD, a clinical fellow at the University of California, San Francisco. “This means that a patient with epilepsy may have a unique combination of seizure rhythms that can inform the days and timing of his or her highest seizure risk,” she added.
The study was published online Feb. 8 in JAMA Neurology.
Distinct chronotypes
Clinicians and patients alike have long observed cyclical patterns in the onset of epileptic seizures. However, such patterns have rarely been measured in a quantitative way.
Previous studies have examined seizure cycles using inpatient seizure monitoring and patients’ seizure diaries, but the duration of these recordings and their accuracy have been limited. Within the past decade, the advent of chronic EEG (cEEG) has allowed researchers to observe the cyclical pattern of interictal epileptiform activity, but the numbers of patients involved in such studies have been limited.
To investigate seizure chronotypes in greater detail, the researchers examined retrospective data for 222 adults with medically refractory focal epilepsy who took part in clinical trials of the NeuroPace responsive neurostimulation (RNS) system.
After implantation in the brain, this system monitors the seizure focus or foci continuously and delivers stimulation to stop seizures. Participants also kept seizure diaries and classified their seizures as simple motor, simple other, complex partial, and generalized tonic-clonic.
Dr. Fan’s group examined three subpopulations of patients to investigate three durations of seizure cycles. They examined self-reported disabling seizures, electrographic seizures, and interictal epileptiform activity. Because patients did not record the time of their disabling seizures, the investigators examined them only in multidien and circannual cycles.
To examine circannual seizure cycles, the investigators included 194 patients who kept continuous seizure diaries for 2 or more years and who reported 24 or more days in which disabling seizures occurred.
To examine multidien seizure cycles, they included 186 participants who reported 24 or more days with disabling seizures over a period of 6 or more months during which the RNS system collected cEEG data. They included 85 patients who had 48 hours or more in which electrographic seizure counts were above zero during 6 or more months of cEEG data collection to examine circadian seizure cycles.
Phase-locking value (PLV) was used to determine the strength of a cycle (i.e., the degree of consistency with which seizures occur during certain phases of a cycle). A PLV of 0 represents a uniform distribution of events during various phases of a cycle; a PLV of 1 indicates that all events occur exactly at the same phase of a cycle.
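The PLV described above is the standard circular-statistics measure: each event's position in a cycle is expressed as a phase angle, the corresponding unit vectors are averaged, and the magnitude of that average is the PLV. A minimal sketch of this definition (illustrative only, not the study authors' code):

```python
import math

def phase_locking_value(phases):
    """Phase-locking value for a set of event phases (radians).

    PLV = | (1/N) * sum_k exp(i * phase_k) |
    0 -> events spread uniformly over the cycle;
    1 -> all events occur at exactly the same phase.
    """
    n = len(phases)
    # Average the unit vectors (cos, sin) for each event phase
    re = sum(math.cos(p) for p in phases) / n
    im = sum(math.sin(p) for p in phases) / n
    return math.hypot(re, im)

# All events at the same phase of the cycle -> PLV of 1
print(phase_locking_value([1.0, 1.0, 1.0]))  # ~1.0

# Events evenly spread around the cycle -> PLV near 0
print(phase_locking_value([0.0, math.pi / 2, math.pi, 3 * math.pi / 2]))  # ~0.0
```

When events are spread uniformly around the cycle the unit vectors cancel, giving a PLV near 0; when events are locked to one phase the vectors align, giving a PLV near 1, matching the interpretation used in the study.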
The population’s median age was 35 years, and the sample included approximately equal numbers of men and women. Patients’ focal epilepsies included mesiotemporal (57.2%), frontal (14.0%), neocortical-temporal (9.9%), parietal (4.1%), occipital (1.4%), and multifocal (13.5%). The data included 1,118 patient-years of cEEG, 754,108 electrographic seizures, and 313,995 self-reported seizures.
The prevalence of statistically significant circannual seizure cycles in this population was 12%. The prevalence of multidien seizure cycles was 60%, and the prevalence of circadian seizure cycles was 89%. Multidien cycles (mean PLV, 0.34) and circadian cycles (mean PLV, 0.34) were stronger than circannual cycles (mean PLV, 0.17).
Among patients with circannual seizure cycles, seizures showed a weak to moderate tendency to cluster in one of the four seasons, but no single season predominated across the group.
Among patients with multidien seizure cycles, investigators identified five patterns of interictal epileptiform activity fluctuations. One pattern had irregular periodicity, and the others reached peak periodicity at 7, 15, 20, and 30 days. For some patients, one or more periodicities occurred. For most patients, electrographic or self-reported seizures tended to occur on the rising phase of the interictal epileptiform activity cycle. Interictal epileptiform activity increased on days around seizures.
Results showed there were five main seizure peak times among patients with circadian seizure cycles: midnight, 3:00 a.m., 9:00 a.m., 2:00 p.m., and 6:00 p.m. These findings corroborate the observations of previous investigations, the researchers noted. Hourly interictal epileptiform activity peaked during the night, regardless of peak seizure time.
“Although the neurostimulation device offers us a unique opportunity to investigate electrographic seizure activity quantitatively, the generalizability of our study is limited to the patient cohort that we studied,” said Dr. Fan. “The study findings are limited to patients with neurostimulation devices used for intractable focal epilepsies.”
The results support patients’ impressions that their seizures occur in a cyclical pattern.
“Ultimately, these findings will be helpful for developing models to aid with seizure forecasting and prediction in order to help reduce the uncertainty of seizure timing for patients with epilepsy,” said Dr. Fan.
“Other implications include optimizing the timing for patients to be admitted into the hospital for seizure characterization based on their seizure chronotype, or possibly tailoring a medication regimen in accordance with a patient’s seizure cycles,” she added.
Need for more research
Commenting on the findings, Tobias Loddenkemper, MD, professor of neurology at Harvard Medical School, Boston, noted that the study is “one of the largest longitudinal seizure pattern analyses, based on the gold standard of intracranially recorded epileptic seizures.”
The research, he added, extends neurologists’ understanding of seizure patterns over time, expands knowledge about seizure chronotypes, and emphasizes a relationship between interictal epileptiform activity and seizures.
The strengths of the study include the recording of seizures with intracranial EEG, its large number of participants, and the long duration of recordings, Dr. Loddenkemper said.
However, he said, it is important to note that self-reports are not always reliable. The results may also reflect the influence of potential confounders of seizure patterns, such as seizure triggers, treatment, stimulation, or sleep-wake, circadian, or hormonal cycles, he added.
“In the short term, validation studies, as well as confirmatory studies with less invasive sensors, may be needed,” said Dr. Loddenkemper.
“This could potentially include a trial that confirms findings prospectively, utilizing results from video EEG monitoring admissions. In the long term, seizure detection and prediction, as well as interventional chronotherapeutic trials, may be enabled, predicting seizures in individual patients and treating at times of greatest seizure susceptibility.”
The study was supported by grants to some of the authors from the Wyss Center for Bio and Neuroengineering, the Ernest Gallo Foundation, the Swiss National Science Foundation, and the Velux Stiftung. Dr. Fan has disclosed no relevant financial relationships.
A version of this article first appeared on Medscape.com.
The results support patients’ impressions that their seizures occur in a cyclical pattern.
“Ultimately, these findings will be helpful for developing models to aid with seizure forecasting and prediction in order to help reduce the uncertainty of seizure timing for patients with epilepsy,” said Dr. Fan.
“Other implications include optimizing the timing for patients to be admitted into the hospital for seizure characterization based on their seizure chronotype, or possibly tailoring a medication regimen in accordance with a patient’s seizure cycles,” she added.
Need for more research
Commenting on the findings, Tobias Loddenkemper, MD, professor of neurology at Harvard Medical School, Boston, noted that the study is “one of the largest longitudinal seizure pattern analyses, based on the gold standard of intracranially recorded epileptic seizures.”
The research, he added, extends neurologists’ understanding of seizure patterns over time, expands knowledge about seizure chronotypes, and emphasizes a relationship between interictal epileptiform activity and seizures.
The strengths of the study include the recording of seizures with intracranial EEG, its large number of participants, and the long duration of recordings, Dr. Loddenkemper said.
However, he said, it is important to note that self-reports are not always reliable. The results may also reflect the influence of potential confounders of seizure patterns, such as seizure triggers, treatment, stimulation, or sleep-wake, circadian, or hormonal cycles, he added.
“In the short term, validation studies, as well as confirmatory studies with less invasive sensors, may be needed,” said Dr. Loddenkemper.
“This could potentially include a trial that confirms findings prospectively, utilizing results from video EEG monitoring admissions. In the long term, seizure detection and prediction, as well as interventional chronotherapeutic trials, may be enabled, predicting seizures in individual patients and treating at times of greatest seizure susceptibility.”
The study was supported by grants to some of the authors from the Wyss Center for Bio and Neuroengineering, the Ernest Gallo Foundation, the Swiss National Science Foundation, and the Velux Stiftung. Dr. Fan has disclosed no relevant financial relationships.
A version of this article first appeared on Medscape.com.
Seizure onset in patients with focal epilepsy follows circadian, multiday, and annual cycles, new research suggests. The findings come from a large longitudinal study of adults with medically refractory focal epilepsy.
“Although daily and multiday rhythms have previously been identified, the extent to which these nonrandom rhythms exist in a larger cohort has been unclear,” said study investigator Joline Marie Fan, MD, a clinical fellow at the University of California, San Francisco. “This means that a patient with epilepsy may have a unique combination of seizure rhythms that can inform the days and timing of his or her highest seizure risk,” she added.
The study was published online Feb. 8 in JAMA Neurology.
Distinct chronotypes
Clinicians and patients alike have long observed cyclical patterns in the onset of epileptic seizures. However, such patterns have rarely been measured in a quantitative way.
Previous studies have examined seizure cycles using inpatient seizure monitoring and patients’ seizure diaries, but the duration and accuracy of these recordings have been limited. Within the past decade, the advent of chronic EEG (cEEG) has allowed researchers to observe the cyclical pattern of interictal epileptiform activity, but such studies have involved limited numbers of patients.
To investigate seizure chronotypes in greater detail, the researchers examined retrospective data for 222 adults with medically refractory focal epilepsy who took part in clinical trials of the NeuroPace responsive neurostimulation (RNS) system.
After implantation in the brain, this system monitors the seizure focus or foci continuously and delivers stimulation to stop seizures. Participants also kept seizure diaries and classified their seizures as simple motor, simple other, complex partial, and generalized tonic-clonic.
Dr. Fan’s group examined three subpopulations of patients to investigate three durations of seizure cycles. They examined self-reported disabling seizures, electrographic seizures, and interictal epileptiform activity. Because patients did not record the time of their disabling seizures, the investigators examined them only in multidien and circannual cycles.
To examine circannual seizure cycles, the investigators included 194 patients who kept continuous seizure diaries for 2 or more years and who reported 24 or more days in which disabling seizures occurred.
To examine multidien seizure cycles, they included 186 participants who reported 24 or more days with disabling seizures over a period of 6 or more months during which the RNS system collected cEEG data. They included 85 patients who had 48 hours or more in which electrographic seizure counts were above zero during 6 or more months of cEEG data collection to examine circadian seizure cycles.
Phase-locking value (PLV) was used to determine the strength of a cycle (i.e., the degree of consistency with which seizures occur during certain phases of a cycle). A PLV of 0 represents a uniform distribution of events during various phases of a cycle; a PLV of 1 indicates that all events occur exactly at the same phase of a cycle.
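The PLV described above has a standard form: the magnitude of the mean of unit-length phase vectors. A minimal sketch (illustrative only; the event phases and sample sizes here are invented, not the study's data):

```python
import numpy as np

def phase_locking_value(phases):
    """PLV = |mean of the unit vectors e^{i*phase}|.
    0 -> events spread uniformly over the cycle's phases;
    1 -> every event falls at exactly the same phase."""
    return float(np.abs(np.mean(np.exp(1j * np.asarray(phases)))))

# Events spread evenly around the cycle -> PLV near 0
uniform = np.linspace(0, 2 * np.pi, 100, endpoint=False)
plv_uniform = phase_locking_value(uniform)

# Every event at the same phase of the cycle -> PLV of 1
locked = np.full(100, np.pi / 4)
plv_locked = phase_locking_value(locked)
```

Intermediate values, like the study's mean multidien PLV of 0.34, indicate events that cluster at a preferred phase without being strictly locked to it.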
The population’s median age was 35 years, and the sample included approximately equal numbers of men and women. Patients’ focal epilepsies included mesiotemporal (57.2%), frontal (14.0%), neocortical-temporal (9.9%), parietal (4.1%), occipital (1.4%), and multifocal (13.5%). The data included 1,118 patient-years of cEEG, 754,108 electrographic seizures, and 313,995 self-reported seizures.
The prevalence of statistically significant circannual seizure cycles in this population was 12%. The prevalence of multidien seizure cycles was 60%, and the prevalence of circadian seizure cycles was 89%. Multidien cycles (mean PLV, 0.34) and circadian cycles (mean PLV, 0.34) were stronger than circannual cycles (mean PLV, 0.17).
Among patients with circannual seizure cycles, individuals showed a weak to moderate tendency for seizures to cluster in one of the four seasons, but no single season predominated across the group.
Among patients with multidien seizure cycles, investigators identified five patterns of interictal epileptiform activity fluctuations. One pattern had irregular periodicity; the others reached peak periodicity at 7, 15, 20, and 30 days. Some patients exhibited more than one periodicity. For most patients, electrographic or self-reported seizures tended to occur on the rising phase of the interictal epileptiform activity cycle, and interictal epileptiform activity increased on the days around seizures.
Results showed there were five main seizure peak times among patients with circadian seizure cycles: midnight, 3:00 a.m., 9:00 a.m., 2:00 p.m., and 6:00 p.m. These findings corroborate the observations of previous investigations, the researchers noted. Hourly interictal epileptiform activity peaked during the night, regardless of peak seizure time.
“Although the neurostimulation device offers us a unique opportunity to investigate electrographic seizure activity quantitatively, the generalizability of our study is limited to the patient cohort that we studied,” said Dr. Fan. “The study findings are limited to patients with neurostimulation devices used for intractable focal epilepsies.”
The results support patients’ impressions that their seizures occur in a cyclical pattern.
“Ultimately, these findings will be helpful for developing models to aid with seizure forecasting and prediction in order to help reduce the uncertainty of seizure timing for patients with epilepsy,” said Dr. Fan.
“Other implications include optimizing the timing for patients to be admitted into the hospital for seizure characterization based on their seizure chronotype, or possibly tailoring a medication regimen in accordance with a patient’s seizure cycles,” she added.
Need for more research
Commenting on the findings, Tobias Loddenkemper, MD, professor of neurology at Harvard Medical School, Boston, noted that the study is “one of the largest longitudinal seizure pattern analyses, based on the gold standard of intracranially recorded epileptic seizures.”
The research, he added, extends neurologists’ understanding of seizure patterns over time, expands knowledge about seizure chronotypes, and emphasizes a relationship between interictal epileptiform activity and seizures.
The strengths of the study include the recording of seizures with intracranial EEG, its large number of participants, and the long duration of recordings, Dr. Loddenkemper said.
However, he said, it is important to note that self-reports are not always reliable. The results may also reflect the influence of potential confounders of seizure patterns, such as seizure triggers, treatment, stimulation, or sleep-wake, circadian, or hormonal cycles, he added.
“In the short term, validation studies, as well as confirmatory studies with less invasive sensors, may be needed,” said Dr. Loddenkemper.
“This could potentially include a trial that confirms findings prospectively, utilizing results from video EEG monitoring admissions. In the long term, seizure detection and prediction, as well as interventional chronotherapeutic trials, may be enabled, predicting seizures in individual patients and treating at times of greatest seizure susceptibility.”
The study was supported by grants to some of the authors from the Wyss Center for Bio and Neuroengineering, the Ernest Gallo Foundation, the Swiss National Science Foundation, and the Velux Stiftung. Dr. Fan has disclosed no relevant financial relationships.
A version of this article first appeared on Medscape.com.
FROM JAMA NEUROLOGY
New data may help intercept head injuries in college football
Novel research from the Concussion Assessment, Research and Education (CARE) Consortium sheds new light on how to effectively reduce the incidence of concussion and head injury exposure in college football.
The study, led by neurotrauma experts Michael McCrea, PhD, and Brian Stemper, PhD, professors of neurosurgery at the Medical College of Wisconsin in Milwaukee, reports data from hundreds of college football players across five seasons and shows that concussion incidence and head impact exposure are disproportionately concentrated in preseason training.
The research also reveals that such injuries occur more often during practices than games.
“We think that with the findings from this paper, there’s a role for everybody to play in reducing injury,” Dr. McCrea said. “We hope these data help inform broad-based policy about practice and preseason training policies in collegiate football. We also think there’s a role for athletic administrators, coaches, and even athletes themselves.”
The study was published online Feb. 1 in JAMA Neurology.
More injuries in preseason
Concussion is one of the most common injuries in football. Beyond the immediate harms of concussion, there are growing concerns that repetitive head impact exposure (HIE) may increase the risk of long-term neurologic health problems, including chronic traumatic encephalopathy (CTE).
The CARE Consortium, which has been conducting research with college athletes across 26 sports and military cadets since 2014, has been interested in multiple facets of concussion and brain trauma.
“We’ve enrolled more than 50,000 athletes and service academy cadets into the consortium over the last 6 years to research all involved aspects including the clinical core, the imaging core, the blood biomarker core, and the genetic core, and we have a head impact measurement core,” Dr. McCrea said.
To investigate the pattern of concussion incidence across the football season in college players, the investigators used impact measurement technology across six Division I NCAA football programs participating in the CARE Consortium from 2015 to 2019.
A total of 658 players – all male, mean age 19 years – were fitted with the Head Impact Telemetry System (HITS) sensor arrays in their helmets to measure head impact frequency, location, and magnitude during play.
“This particular study had built-in algorithms that weeded out impacts that were below 10G of linear magnitude, because those have been determined not likely to be real impacts,” Dr. McCrea said.
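The thresholding step Dr. McCrea describes can be sketched in a few lines (the acceleration values and the treatment of the 10g boundary are assumptions for illustration, not the HITS algorithm itself):

```python
# Hypothetical peak linear accelerations (in g) from a helmet sensor array
candidate_impacts_g = [4.2, 12.7, 9.9, 55.0, 10.0, 31.4]

# Discard candidate impacts below 10g of linear magnitude, which are
# unlikely to represent real head impacts
valid_impacts_g = [a for a in candidate_impacts_g if a >= 10.0]
```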
Across the five seasons studied, 528,684 head impacts recorded met the quality standards for analysis. Players sustained a median of 415 (interquartile range [IQR], 190-727) impacts per season.
Over the study period, 68 players sustained a diagnosed concussion. In total, 48.5% of concussions occurred during preseason training, despite the preseason representing only 20.8% of the football season. Head impact exposure in the preseason was twice that of the regular season (324.9 vs. 162.4 impacts per team per day; mean difference, 162.6 impacts; 95% confidence interval, 110.9-214.3; P < .001).
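The scale of the disproportion can be checked directly from the article's figures; a quick sketch:

```python
# Preseason accounts for 20.8% of the season but 48.5% of diagnosed
# concussions: roughly 2.3x the rate expected if concussions were
# spread evenly across the season.
relative_concussion_rate = 0.485 / 0.208

# Daily head impact exposure per team roughly doubles in preseason.
exposure_ratio = 324.9 / 162.4
```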
“Preseason training often has a much higher intensity to it, in terms of the total hours, the actual training, and the heavy emphasis on full-contact drills like tackling and blocking,” said Dr. McCrea. “Even the volume of players that are participating is greater.”
Results also showed that in each of the five seasons, head injury exposure per athlete was highest in August, during the preseason (median, 146.0 impacts; IQR, 63.0-247.8), and lowest in November (median, 80.0 impacts; IQR, 35.0-148.0). In the studied period, 72% of concussions and 66.9% of head injury exposure occurred in practice. Even within the regular season, total head injury exposure in practices was 84.2% higher than in games.
“This incredible dataset we have on head impact measurement also gives us the opportunity to compare it with our other research looking at the correlation between a single head impact and changes in brain structure and function on MRI, on blood biomarkers, giving us the ability to look at the connection between mechanism of effect of injury and recovery from injury,” said Dr. McCrea.
These findings also provide an opportunity to modify approaches to preseason training and football practices to keep players safer, said Dr. McCrea, noting that about half of the variance in head injury exposure is at the level of the individual athlete.
“With this large body of athletes we’ve instrumented, we can look at, for instance, all of the running backs and understand the athlete and what his head injury exposure looks like compared to all other running backs. If we find out that an athlete has a rate of head injury exposure that’s 300% higher than most other players that play the same position, we can take that data directly to the athlete to work on their technique and approach to the game.
“Every researcher wishes that their basic science or their clinical research findings will have some impact on the health and well-being of the population they’re studying. By modifying practices and preseason training, football teams could greatly reduce the risk of injury and exposure for their players, while still maintaining the competitive nature of game play,” he added.
Through a combination of policy and education, similar strategies could be implemented to help prevent concussion and HIE in high school and youth football too, said Dr. McCrea.
‘Shocking’ findings
In an accompanying editorial, Christopher J. Nowinski, PhD, of the Concussion Legacy Foundation, Boston, and Robert C. Cantu, MD, department of neurosurgery, Emerson Hospital, Concord, Massachusetts, said the findings could have significant policy implications and offer a valuable expansion of prior research.
“From 2005 to 2010, studies on college football revealed that about two-thirds of head impacts occurred in practice,” they noted. “We cited this data in 2010 when we proposed to the NFL Players Association that the most effective way to reduce the risks of negative neurological outcomes was to reduce hitting in practice. They agreed, and in 2011 collectively bargained for severe contact limits in practice, with 14 full-contact practices allowed during the 17-week season. Since that rule was implemented, only 18% of NFL concussions have occurred in practice.”
“Against this backdrop, the results of the study by McCrea et al. are shocking,” they added. “It reveals that college football players still experience 72% of their concussions and 67% of their total head injury exposure in practice.”
Even more shocking, noted Dr. Nowinski and Dr. Cantu, is that these numbers are almost certainly an underestimate of the dangers of practice.
“As a former college football player and a former team physician, respectively, we find this situation inexcusable. Concussions in games are inevitable, but concussions in practice are preventable,” they wrote.
“Laudably,” they added, “the investigators call on the NCAA and football conferences to explore policy and rule changes to reduce concussion incidence and HIE and to create robust educational offerings to encourage change from coaches and college administrators.”
A version of this article first appeared on Medscape.com.
Novel research from the Concussion Assessment, Research and Education (CARE) Consortium sheds new light on how to effectively reduce the incidence of concussion and head injury exposure in college football.
The study, led by neurotrauma experts Michael McCrea, PhD, and Brian Stemper, PhD, professors of neurosurgery at the Medical College of Wisconsin in Milwaukee, reports data from hundreds of college football players across five seasons and shows
The research also reveals that such injuries occur more often during practices than games.
“We think that with the findings from this paper, there’s a role for everybody to play in reducing injury,” Dr. McCrea said. “We hope these data help inform broad-based policy about practice and preseason training policies in collegiate football. We also think there’s a role for athletic administrators, coaches, and even athletes themselves.”
The study was published online Feb. 1 in JAMA Neurology.
More injuries in preseason
Concussion is one of the most common injuries in football. Beyond these harms are growing concerns that repetitive HIE may increase the risk of long-term neurologic health problems including chronic traumatic encephalopathy (CTE).
The CARE Consortium, which has been conducting research with college athletes across 26 sports and military cadets since 2014, has been interested in multiple facets of concussion and brain trauma.
“We’ve enrolled more than 50,000 athletes and service academy cadets into the consortium over the last 6 years to research all involved aspects including the clinical core, the imaging core, the blood biomarker core, and the genetic core, and we have a head impact measurement core.”
To investigate the pattern of concussion incidence across the football season in college players, the investigators used impact measurement technology across six Division I NCAA football programs participating in the CARE Consortium from 2015 to 2019.
A total of 658 players – all male, mean age 19 years – were fitted with the Head Impact Telemetry System (HITS) sensor arrays in their helmets to measure head impact frequency, location, and magnitude during play.
“This particular study had built-in algorithms that weeded out impacts that were below 10G of linear magnitude, because those have been determined not likely to be real impacts,” Dr. McCrea said.
Across the five seasons studied, 528,684 head impacts recorded met the quality standards for analysis. Players sustained a median of 415 (interquartile range [IQR], 190-727) impacts per season.
Of those, 68 players sustained a diagnosed concussion. In total, 48.5% of concussions occurred during preseason training, despite preseason representing only 20.8% of the football season. Total head injury exposure in the preseason occurred at twice the proportion of the regular season (324.9 vs. 162.4 impacts per team per day; mean difference, 162.6 impacts; 95% confidence interval, 110.9-214.3; P < .001).
“Preseason training often has a much higher intensity to it, in terms of the total hours, the actual training, and the heavy emphasis on full-contact drills like tackling and blocking,” said Dr. McCrea. “Even the volume of players that are participating is greater.”
Results also showed that in each of the five seasons, head injury exposure per athlete was highest in August (preseason) (median, 146.0 impacts; IQR, 63.0-247.8) and lowest in November (median, 80.0 impacts; IQR, 35.0-148.0). In the studied period, 72% of concussions and 66.9% of head injury exposure occurred in practice. Even within the regular season, total head injury exposure in practices was 84.2% higher than in games.
“This incredible dataset we have on head impact measurement also gives us the opportunity to compare it with our other research looking at the correlation between a single head impact and changes in brain structure and function on MRI, on blood biomarkers, giving us the ability to look at the connection between mechanism of effect of injury and recovery from injury,” said Dr. McCrea.
These findings also provide an opportunity to modify approaches to preseason training and football practices to keep players safer, said Dr. McCrea, noting that about half of the variance in head injury exposure is at the level of the individual athlete.
“With this large body of athletes we’ve instrumented, we can look at, for instance, all of the running backs and understand the athlete and what his head injury exposure looks like compared to all other running backs. If we find out that an athlete has a rate of head injury exposure that’s 300% higher than most other players that play the same position, we can take that data directly to the athlete to work on their technique and approach to the game.
“Every researcher wishes that their basic science or their clinical research findings will have some impact on the health and well-being of the population they’re studying. By modifying practices and preseason training, football teams could greatly reduce the risk of injury and exposure for their players, while still maintaining the competitive nature of game play,” he added.
Through a combination of policy and education, similar strategies could be implemented to help prevent concussion and HIE in high school and youth football too, said Dr. McCrea.
‘Shocking’ findings
In an accompanying editorial, Christopher J. Nowinski, PhD, of the Concussion Legacy Foundation, Boston, and Robert C. Cantu, MD, department of neurosurgery, Emerson Hospital, Concord, Massachusetts, said the findings could have significant policy implications and offer a valuable expansion of prior research.
“From 2005 to 2010, studies on college football revealed that about two-thirds of head impacts occurred in practice,” they noted. “We cited this data in 2010 when we proposed to the NFL Players Association that the most effective way to reduce the risks of negative neurological outcomes was to reduce hitting in practice. They agreed, and in 2011 collectively bargained for severe contact limits in practice, with 14 full-contact practices allowed during the 17-week season. Since that rule was implemented, only 18% of NFL concussions have occurred in practice.”
“Against this backdrop, the results of the study by McCrea et al. are shocking,” they added. “It reveals that college football players still experience 72% of their concussions and 67% of their total head injury exposure in practice.”
Even more shocking, noted Dr. Nowinski and Dr. Cantu, is that these numbers are almost certainly an underestimate of the dangers of practice.
“As a former college football player and a former team physician, respectively, we find this situation inexcusable. Concussions in games are inevitable, but concussions in practice are preventable,” they wrote.
“Laudably,” they added, “the investigators call on the NCAA and football conferences to explore policy and rule changes to reduce concussion incidence and HIE and to create robust educational offerings to encourage change from coaches and college administrators.”
A version of this article first appeared on Medscape.com.
FROM JAMA NEUROLOGY
Using engineered T cells reduced acute, chronic GVHD
A novel T-cell engineered product, Orca-T (Orca Bio), was associated with lower incidence of both acute and chronic graft-versus-host disease (GVHD) and more than double the rate of GVHD-free and relapse-free survival, compared with the current standard of care for patients undergoing hematopoietic stem cell transplants (HSCT), investigators said.
In both a multicenter phase 1 trial (NCT04013685) and a single-center phase 1/2 trial (NCT01660607) with a total of 50 patients, those who received Orca-T with single-agent GVHD prophylaxis had a 1-year GVHD-free and relapse-free survival rate of 75%, compared with 31% for patients who received standard of care with two-agent prophylaxis, reported Everett H. Meyer, MD, PhD, from Stanford (Calif.) University.
“Orca-T has good evidence for reduced acute graft-versus-host disease, reduced chronic graft-versus-host disease, and a low nonrelapse mortality,” he said at the Transplant & Cellular Therapies Meetings.
The product can be quickly manufactured and delivered to treatment centers across the continental United States, with “vein-to-vein” time of less than 72 hours, he said at the meeting held by the American Society for Blood and Marrow Transplantation and the Center for International Blood and Marrow Transplant Research.
Orca-T consists of highly purified, donor-derived T-regulatory (Treg) cells that are sorted and delivered on day 0 with hematopoietic stem cells, without immunosuppressants, followed 2 days later with infusion of a matching dose of conventional T cells.
“The Treg cells are allowed to expand to create the right microenvironment for the [conventional T cells],” he explained.
In preclinical studies, donor-derived, high-purity Tregs delivered prior to adoptive transfer of conventional T cells prevented GVHD while maintaining graft-versus-tumor immunity, he said.
Two T-cell infusions
He reported updated results from current studies on a total of 50 adults, with a cohort of 144 patients treated concurrently with standard of care as controls.
The Orca-T–treated patients had a median age of 47, and 52% were male. Indications for transplant included acute myeloid and acute lymphoblastic leukemia, chronic myeloid leukemia, B-cell lymphoma, myelodysplastic syndrome/myelofibrosis, and other unspecified indications.
In both the Orca-T and control cohorts, patients underwent myeloablative conditioning from 10 to 2 days prior to stem cell infusion.
As noted, patients in the experimental arm received an infusion of hematopoietic stem/progenitor cells and Tregs, followed 2 days later by conventional T-cell infusion and, on the day after that, tacrolimus at a target trough level of 4.6 ng/mL. The conventional T cells were reserved from donor apheresis and were otherwise unmanipulated prior to infusion into the recipient, Dr. Meyer noted.
Patients in the standard-of-care arm received tacrolimus on the day before standard infusion of the apheresis product, followed by methotrexate prophylaxis on days 1, 3, 6 and 11.
Time to neutrophil engraftment, time to platelet engraftment, and time from day 0 to hospital discharge were all significantly shorter in the Orca-T group, at 12 versus 14 days (P < .0001), 11 versus 17 days (P < .0001), and 15 versus 17 days (P = .01), respectively.
At 100 days of follow-up, the rate of grade 2 or greater acute GVHD was 30% among standard-of-care patients versus 10% among Orca-T–treated patients. At 1-year follow-up, respective rates of chronic GVHD were 46% vs. 3%.
Safety
“In general, the protocol is extremely well tolerated by our patients. We’ve seen no exceptional infectious disease complications, and we’ve seen no other major complications,” Dr. Meyer said.
Cytomegalovirus prophylaxis was used variably, depending on the center and on the attending physician. Epstein-Barr virus reactivation occurred in eight patients, with one requiring therapy, but there was no biopsy or radiographic evidence of posttransplant lymphoproliferative disorder.
In all, 18% of patients had serious adverse events during the reporting period, all of which resolved. There were no treatment-related deaths in the Orca-T arm, compared with 11% of controls.
Engraftment differences explored
In the question-and-answer session following the presentation, Christopher J. Gamper, MD, PhD, from the Johns Hopkins Hospital in Baltimore, told Dr. Meyer that “your outcomes from Orca-T look excellent,” and asked about the cost differential, compared with similar, unmanipulated transplants performed with standard GVHD prophylaxis.
“Is this recovered by lower costs for treatment of GVHD?” he asked.
“I have not done an economic cost analysis of course, and I think others may be looking into this,” Dr. Meyer replied. “Graft engineering can be expensive, although it’s an engineering proposition and one could imagine that the costs will go down substantially over time.”
Session moderator Alan Hanash, MD, PhD, from Memorial Sloan Kettering Cancer Center in New York, commented on the differences in engraftment between the experimental and control arms, and asked Dr. Meyer: “Do you think this is due to the difference in prophylaxis? Absence of methotrexate? Do you think that it could be a direct impact of regulatory T cells on hematopoietic engraftment?”
“Certainly not having methotrexate is beneficial for engraftment, and may account for the differences we see,” Dr. Meyer said. “However, it is possible that Tregs could be playing a facilitative role. There certainly is good preclinical literature that Tregs, particularly in the bone marrow space, can facilitate bone marrow engraftment.”
The Orca-T trials are sponsored by Orca Bio and Stanford, with support from the National Institutes of Health. Dr. Meyer receives research support from Orca and is a scientific adviser to GigaGen, Triursus, Incyte, and Indee Labs. Dr. Hanash and Dr. Gamper had no relevant disclosures.
FROM TCT 2021
Novel ddPCR assay precisely measures CAR T-cells after infusion
A novel quantitative assay used with flow cytometry helps to precisely measure chimeric antigen receptor (CAR) T-cell engraftment and in vivo expansion to predict patient outcomes after CAR T-cell infusion, according to researchers at the Fondazione IRCCS Istituto Nazionale dei Tumori in Milan.
Higher frequencies of CAR-positive T cells at day 9 after infusion, as measured using the polymerase chain reaction (PCR)-based assay, accurately distinguished responders from nonresponders, Paolo Corradini, MD, said at the 3rd European CAR T-cell Meeting.
The findings, first presented in December at the American Society of Hematology annual conference, suggest the assay could improve treatment decision-making, Dr. Corradini of the University of Milan said at the meeting, which is jointly sponsored by the European Society for Blood and Marrow Transplantation and the European Hematology Association.
He and his colleagues prospectively collected samples from 16 patients with diffuse large B-cell lymphoma, 5 with transformed follicular lymphoma, and 7 with primary mediastinal B-cell lymphoma who were treated with either axicabtagene ciloleucel (axi-cel; Yescarta) or tisagenlecleucel (tisa-cel; Kymriah) between November 2019 and July 2020. CAR T cells were monitored using flow cytometry.
Pivotal trial data and subsequent findings with respect to tisa-cel and axi-cel have demonstrated that CAR T-cell engraftment and in vivo expansion have a crucial impact on disease response and toxicity: a cutoff value of greater than 24.5 CAR+ cells/mcL at day 9 distinguished responders from nonresponders with a sensitivity of 87.5% and a specificity of 81%, Dr. Corradini noted.
“But we have also devised a methodology by digital droplet PCR (ddPCR) recently that correlates perfectly with the flow cytometry data,” he said, adding that the assay is “easy and allowed precise enumeration of the CAR T cells in the blood of the patient.”
The R square (coefficient of determination) for ddPCR and flow cytometry was 0.9995 and 0.9997 for tisa-cel and axi-cel, respectively (P < .0001 for each). This is particularly useful for assessing whether low CAR T-cell levels on flow cytometry are background signals resulting from nonspecific binding of the antibodies or true low levels, and the findings therefore have implications for improving clinical decision-making and outcomes in CAR T-cell therapy recipients, he said.
REPORTING FROM CART21
Steroid complications in GVHD common, boost costs of care
Steroids are usually the first choice of therapy for the treatment of patients with graft-vs.-host disease (GVHD), but complications from steroid use may carry a high financial cost, investigators caution.
Among 689 patients with a diagnosis of GVHD following a hematopoietic stem cell transplant (HSCT) who received steroids, 685 (97%) had at least one steroid-related complication, resulting in nearly $165,000 in mean health-care costs over 24 months, said Elizabeth J. Bell, PhD, MPH, an epidemiologist at Optum Inc.
“For both acute and chronic GVHD, the standard of care for first-line treatment is systemic steroids. The complications associated with steroid treatment are well known. However, the health-care resources utilized and the costs incurred by these patients are not well-quantified,” she said at the Transplantation & Cellular Therapies Meetings (Abstract 12).
Dr. Bell reported the results of a retrospective database analysis on costs associated with steroid complications in HSCT recipients at the meeting, which was held by the American Society for Blood and Marrow Transplantation and the Center for International Blood and Marrow Transplant Research.
She and colleagues from Optum, Incyte, and the University of Minnesota in Minneapolis looked at data on 689 patients with a diagnosis of GVHD after HSCT who received systemic steroids from July 1, 2010, through Aug. 31, 2019. The data were extracted from the Optum Research database, and included U.S. commercial and Medicare Advantage patients.
They looked at total complications and steroid-associated complications in each of four categories: infections; metabolic or endocrine complications (for example, diabetes, dyslipidemia); gastrointestinal (GI) complications (e.g., peptic ulcer disease); and bone or muscle complications (myopathy, etc).
They estimated costs based on International Classification of Diseases (ICD) codes for any steroid complications during the 24 months after steroid initiation, including those complications that may have been present at the time of GVHD diagnosis.
The median patient age was 55 years, and 60% of the sample were male. The mean Charlson Comorbidity Index score at baseline was 3.
Overall, 22% of patients had only acute GVHD, 21% had only chronic GVHD, and 39% had both acute and chronic disease. The GVHD type was unspecified in the remaining 18%.
The median time from GVHD diagnosis to initiating steroids was 30 days for patients with acute disease, those with chronic disease, and those with both presentations. The median time to initiation was 36 days for patients with unspecified GVHD type.
The median cumulative duration of steroid use over 24 months was 62 days for patients with acute GVHD, 208 days for those with chronic GVHD, 166 days for those with both, and 74 days for patients with unspecified GVHD type.
As noted before, complications occurred in 97% of patients, with infections being the most common complications, occurring in 80% of patients, followed by metabolic/endocrine complications in 32%, gastrointestinal in 29%, and bone/muscle complications in 20%.
For the 665 patients who had any steroid-related complication, the mean costs of steroid-associated care in the 24 months after they were started on steroids was $164,787, and the median cost was $50,834.
Health care costs were highest among patients with infections, at a mean of $167,473 and a median of $57,680, followed by bone/muscle conditions ($75,289 and $2,057, respectively), GI conditions ($67,861 and $3,360), and metabolic or endocrine conditions ($47,101 and $1,164).
In all categories, hospitalizations accounted for the large majority of costs.
Two-thirds (66%) of patients who experienced any steroid-related complication required hospitalization, primarily for infections.
Among all patients with complications, the median cumulative hospital stay over 24 months was 20 days, with bone/muscle complications and infections associated with a median of 19 and 18 days of hospitalization, respectively.
Dr. Bell acknowledged that the study was limited by use of ICD coding to identify steroid complication-related health-care utilization and costs, which can be imprecise, and by the fact that the analysis included only complications resulting in health care use as documented in medical claims. In addition, the investigators noted that they could not control for the possibility that steroids exacerbated conditions that existed at baseline.
“These findings emphasize the need to cautiously evaluate the treatment options for patients with GVHD. Future study with medical records is needed to provide insights on the clinical aspects of the complications (e.g., severity and suspected causality),” Dr. Bell and colleagues concluded in the study’s abstract.
Definitions questioned
An HSCT specialist approached for comment said that the findings of the study made sense, but she had questions regarding the study methodology.
“I would intuitively think that steroid-associated complications are a major cause of health care use in GVHD patients, and it’s interesting to see that there is emerging data to support this hypothesis,” HSCT specialist Hélène Schoemans, MD, of the University of Leuven, Belgium, said in an interview.
She noted, however, that “it is surprising that the period of steroid initiation was the same for acute and chronic GVHD,” and questioned whether that anomalous finding could be due to the study’s definition of acute and chronic GVHD or to how the period from baseline to steroid initiation was defined.
The questions about the definitions and timing of therapy make it uncertain as to whether the complications reported were caused by steroids or by some other factor, she suggested.
The study was supported by Optum Inc. Dr. Bell is an employee of the company, and a paid consultant of Incyte. Dr. Schoemans has received travel expenses from Celgene, Abbvie, and Incyte; is part of the advisory boards for Incyte; and has received speakers fees from Novartis, Incyte, Jazz Pharmaceuticals, and Takeda.
Steroids are usually the first choice of therapy for the treatment of patients with graft-vs.-host disease (GVHD), but complications from steroid use may carry a high financial cost, investigators caution.
Among 689 patients with a diagnosis of GVHD following a hematopoietic stem cell transplant (HSCT) who received steroids, 665 (97%) had at least one steroid-related complication, resulting in nearly $165,000 in mean health-care costs over 24 months, said Elizabeth J. Bell, PhD, MPH, an epidemiologist at Optum Inc.
“For both acute and chronic GVHD, the standard of care for first-line treatment is systemic steroids. The complications associated with steroid treatment are well known. However, the health-care resources utilized and the costs incurred by these patients are not well-quantified,” she said at the Transplantation & Cellular Therapies Meetings (Abstract 12).
Dr. Bell reported the results of a retrospective database analysis on costs associated with steroid complications in HSCT recipients at the meeting, which was held by the American Society for Blood and Marrow Transplantation and the Center for International Blood and Marrow Transplant Research.
She and colleagues from Optum, Incyte, and the University of Minnesota in Minneapolis looked at data on 689 patients with a diagnosis of GVHD after HSCT who received systemic steroids from July 1, 2010, through Aug. 31, 2019. The data were extracted from the Optum Research database, and included U.S. commercial and Medicare Advantage patients.
They looked at total complications and steroid-associated complications in each of four categories: infections; metabolic or endocrine complications (e.g., diabetes, dyslipidemia); gastrointestinal (GI) complications (e.g., peptic ulcer disease); and bone or muscle complications (e.g., myopathy).
They estimated costs based on International Classification of Diseases (ICD) codes for any steroid complications during the 24 months after steroid initiation, including those complications that may have been present at the time of GVHD diagnosis.
The median patient age was 55 years, and 60% of the sample were male. The mean Charlson Comorbidity Index score at baseline was 3.
Overall, 22% of patients had only acute GVHD, 21% had only chronic GVHD, and 39% had both acute and chronic disease. The GVHD type was unspecified in the remaining 18%.
The median time from GVHD diagnosis to steroid initiation was 30 days for patients with acute GVHD, for those with chronic GVHD, and for those with both presentations. The median time to initiation was 36 days for patients with unspecified GVHD type.
The median cumulative duration of steroid use over 24 months was 62 days for patients with acute GVHD, 208 days for those with chronic GVHD, 166 days for those with both, and 74 days for patients with unspecified GVHD type.
As noted before, complications occurred in 97% of patients, with infections being the most common complications, occurring in 80% of patients, followed by metabolic/endocrine complications in 32%, gastrointestinal in 29%, and bone/muscle complications in 20%.
For the 665 patients who had any steroid-related complication, the mean cost of steroid-associated care in the 24 months after steroid initiation was $164,787, and the median cost was $50,834.
Health care costs were highest among patients with infections, at a mean of $167,473 and a median of $57,680, followed by bone/muscle conditions ($75,289 and $2,057, respectively), GI conditions ($67,861 and $3,360), and metabolic or endocrine conditions ($47,101 and $1,164).
In all categories, hospitalizations accounted for the large majority of costs.
Two-thirds (66%) of patients who experienced any steroid-related complication required hospitalization, primarily for infections.
Among all patients with complications, the median cumulative hospital stay over 24 months was 20 days, with bone/muscle complications and infections associated with a median of 19 and 18 days of hospitalization, respectively.
Dr. Bell acknowledged that the study was limited by use of ICD coding to identify steroid complication-related health-care utilization and costs, which can be imprecise, and by the fact that the analysis included only complications resulting in health care use as documented in medical claims. In addition, the investigators noted that they could not control for the possibility that steroids exacerbated conditions that existed at baseline.
“These findings emphasize the need to cautiously evaluate the treatment options for patients with GVHD. Future study with medical records is needed to provide insights on the clinical aspects of the complications (e.g., severity and suspected causality),” Dr. Bell and colleagues concluded in the study’s abstract.
Definitions questioned
An HSCT specialist approached for comment said that the findings of the study made sense, but she had questions regarding the study methodology.
“I would intuitively think that steroid-associated complications are a major cause of health care use in GVHD patients, and it’s interesting to see that there is emerging data to support this hypothesis,” Hélène Schoemans, MD, of the University of Leuven, Belgium, said in an interview.
She noted, however, that “it is surprising that the period of steroid initiation was the same for acute and chronic GVHD,” and questioned whether that anomalous finding could be due to the study’s definition of acute and chronic GVHD or to how the period from baseline to steroid initiation was defined.
The questions about the definitions and timing of therapy make it uncertain as to whether the complications reported were caused by steroids or by some other factor, she suggested.
The study was supported by Optum Inc. Dr. Bell is an employee of the company and a paid consultant to Incyte. Dr. Schoemans has received travel expenses from Celgene, AbbVie, and Incyte; serves on advisory boards for Incyte; and has received speaker fees from Novartis, Incyte, Jazz Pharmaceuticals, and Takeda.
FROM TCT 2021