B-cell repletion is common with MS drug, but no symptom worsening
B-cell repletion is fairly common among patients with multiple sclerosis (MS) treated with ocrelizumab. However, there is no corresponding worsening of symptoms or signs of a “wearing-off” effect, new research shows.
“Most people expect that since this is a B-cell depleting drug, that if you are not depleting B cells, then that should be reflected clinically and there should be some breakthrough activity,” said study investigator Joshua D. Katz, MD, codirector of the Elliot Lewis Center for Multiple Sclerosis Care in Wellesley, Massachusetts.
“So [these results] were a surprise, but I would not conclude from our data that B-cell repletion does not put someone at risk. We can only say that we didn’t observe anybody having a breakthrough,” he added.
The research was presented at the 2021 Annual Meeting of the Consortium of Multiple Sclerosis Centers (CMSC).
Real-world study
Preapproval clinical trials of ocrelizumab suggest about 5% of patients experience a repletion of B cells. However, the timing and association with breakthrough symptoms were unclear.
To investigate, Dr. Katz and colleagues conducted two studies. The first was a substudy of the prospective ACAPELLA trial, which is assessing ocrelizumab-associated adverse events in a real-world population. The substudy included 294 patients with relapsing and progressive forms of MS treated with at least two cycles of ocrelizumab, given as an infusion once every 6 months.
The results showed that overall, 91 (31%) of the 294 patients had some degree of repletion at one or more timepoints.
When patients were categorized according to their highest CD19 measure after two cycles, 108 (64.7%) had no significant repletion of B cells after infusion, defined as an increase of less than 10 cells/μL, while 45 (26.9%) were considered mild repleters, defined as having increases of 10-49 cells/μL.
Seven patients (4.2%) were moderate repleters, with an increase of 50-79 cells/μL, and 7 (4.2%) were categorized as marked repleters, with increases of 80 or more cells/μL.
Eight patients in the study fully repleted, with values of 114-319 cells/μL, between 23 and 34 weeks after the last infusion.
However, there was no relationship between B-cell repletion and clinical or MRI evidence of relapse.
Of note, the proportion of patients who did not have B-cell repletion increased with greater numbers of infusions. Whereas 64.7% were non-repleters at cycle 2, that number increased to 88.8% by cycle 6, with a slight drop to 85.6% being non-repleters by cycle 7 (36 months).
“Mild B-cell repletion was fairly common after two cycles of ocrelizumab, but with repeated dosing, a greater proportion of patients were non-repleters, suggesting that cumulative exposure to ocrelizumab results in greater depletion,” the researchers noted.
However, “while the number of moderate or marked repleters in our study was small, they had a tendency to remain repleters over time with subsequent infusions,” they added.
In looking at patient characteristics, moderate and marked repleters had higher mean BMIs (34.1 and 32.6, respectively) compared with non- and mild repleters (27.0 and 29.4, respectively; P < .0001).
Dr. Katz noted that the increased risk of B-cell repletion with higher BMI was not a surprise. This association, he said, “makes sense” because patients’ relative exposure to ocrelizumab decreases with higher BMI. Similar patterns with BMI were observed in the clinical trial for ocrelizumab approval, in which patients with lower BMI tended to have greater improvement.
No symptom worsening
In the second study, the investigators examined changes in symptom burden in relation to the time elapsed since ocrelizumab infusion. They evaluated 110 patients, aged 18-80 years (mean age, 44.8 years), who had Expanded Disability Status Scale (EDSS) scores between 0 and 7. Study participants were either initiating ocrelizumab or had been on the drug for at least 1 year.
Symptom burden was evaluated with the Quality of Life in Neurological Disorders (Neuro-QoL) questionnaire and the SymptoMScreen patient-reported outcome measure at the beginning of the study, at week 4, and near the end of the ocrelizumab infusion cycle, at week 22.
The researchers found that among the 69 participants who completed the questionnaires, there were no significant differences at week 22 versus week 4 across a wide range of symptoms, including walking, spasticity, pain, fatigue, cognitive function, dizziness, and depression.
The only change on the Neuro-QoL was in the sleep disturbance domain, which showed a marginal improvement at the end of the cycle (P = .052). This study did not evaluate changes in B cells.
Dr. Katz noted that the inclusion of patients over age 55 in the study offered important insights.
“Our hypothesis was that we were going to start seeing a higher rate of complications, especially infections, in people who are older and may be at a higher risk of infection and disability,” Dr. Katz noted. “But so far, we haven’t seen any higher risk in older patients or those with more disability than anyone else, which is good news.”
Amplification of baseline symptoms not uncommon
Commenting on the research, Scott D. Newsome, DO, current president of the CMSC, noted that although no association was observed between B-cell repletion and symptoms, amplification of baseline symptoms tied to the timing of B-cell depleting therapy infusions is not uncommon.
“The ‘wearing-off’ phenomenon is not unique to the B-cell therapies,” said Dr. Newsome, who is also director of Johns Hopkins University’s Neurosciences Consultation and Infusion Center and an associate professor of neurology at the Johns Hopkins University School of Medicine. “With natalizumab (Tysabri), patients can have an amplification of baseline symptoms as they come closer to their next infusion, and it has been speculated that maybe it was something biologically happening, such as inflammatory cytokines ramping back up or some other mechanisms.”
“Now that we have the B-cell depleting therapies, we tend to see the same kind of pattern, where a few weeks leading up to the next infusion, people will develop these amplified symptoms,” he said.
A cumulative depletion effect, which appears to offset the B-cell repletion associated with early infusions, could have implications over time, Dr. Newsome noted.
“This is important because if people are going on these therapies long-term, the question we may need to ask is whether they actually need to continue to get an infusion every 6 months,” he said.
As questions around the safety of long-term immunosuppressant use continue, different dosing regimens may need to be considered to mitigate potential infection risk, he added.
Dr. Katz reports consulting and/or speakers’ bureau relationships with Alexion, Biogen, EMD Serono, Genentech, Novartis, and Sanofi. Dr. Newsome reports relationships with Autobahn, BioIncept, Biogen, Genentech, Novartis, Bristol Myers Squibb, EMD Serono, Greenwich Biosciences, and MedDay Pharmaceuticals.
A version of this article first appeared on Medscape.com.
FROM CMSC 2021
Brief, automated cognitive test may offer key advantages in MS
The NIH Toolbox Cognition Battery (NIHTB-CB) appears to be a valid and convenient measure of cognitive function in patients with relapsing-remitting multiple sclerosis (RRMS), new research shows.
“To our knowledge this is the first psychometric evaluation of the NIH Toolbox Cognition Battery in MS,” said study investigator Heena R. Manglani, MA, a clinical psychology fellow at Massachusetts General Hospital and Harvard Medical School, Boston.
“[The findings] suggest that the NIH Toolbox Cognition Battery may be used as an alternative to other gold-standard measures which may cover limited domains or require manual scoring,” added Ms. Manglani, who is working toward her PhD in clinical psychology.
The study was presented at the 2021 Annual Meeting of the Consortium of Multiple Sclerosis Centers (CMSC).
An indicator of disease activity?
Cognitive deficits affecting a range of functions, including memory, attention, and communication, occur in 34%-65% of patients with MS, and the ability to detect and monitor such deficits has important implications.
Cognitive changes offer a unique opportunity to identify acute disease activity that may already be occurring before physical manifestations become apparent, said Ms. Manglani. “If we can detect subtle changes in cognition that might foreshadow other symptoms of disease worsening, we can then allocate interventions that might stave off cognitive decline,” she explained.
While there is an array of well-established neuropsychological tests for the assessment of cognitive deficits, each has limitations, so a shorter, computerized, convenient, and reliable test could prove beneficial.
The NIHTB-CB has been validated in a large, nationally representative sample of individuals aged 8 to 85 and represents a potentially attractive option, yielding composite measures and scores corrected for age, gender, education, race, and ethnicity.
Comparative testing
To compare the test with other leading cognition tools used in MS, the investigators recruited 87 patients with RRMS (79% female; mean age, 47.3 years). Participants performed the full NIHTB-CB (about 30 minutes) and the full Minimal Assessment of Cognitive Function in Multiple Sclerosis (MACFIMS), which takes about 90 minutes, as well as selected subtests from the Wechsler Adult Intelligence Scale-IV (WAIS-IV) covering processing speed and working memory. All patients had an EDSS score of 5.0 or below and, on average, had been living with MS for about a decade.
The results showed the normative scores for NIHTB-CB had significant concordance with the other measures in terms of processing speed (concordance correlation coefficient [CCC] range = 0.28-0.48), working memory (CCC range = 0.27-0.37), and episodic memory (CCC range = 0.21-0.32). However, agreement was not shown for executive function (CCC range = 0.096-0.11).
Ms. Manglani noted executive function included various submeasures such as planning and inhibitory control. “Perhaps our gold standard measures tapped into a different facet of executive function than measured by the NIHTB,” she said.
The investigators found the proportion of participants classified as cognitively impaired was similar between the MACFIMS and the NIHTB tests.
Further assessment of fluid cognition on the NIHTB-CB (a composite of processing speed, working memory, episodic memory, and executive function that the toolbox generates automatically) showed the measure was negatively associated with disease severity as measured by the EDSS (P = .006). However, it was not associated with depression (P = .39) or fatigue (P = .69).
Of note, a similar association with EDSS-measured disease severity was not observed with the MACFIMS.
“Interestingly, we found that only the NIHTB-CB fluid cognition was associated with disease severity, such that it was associated with nearly 11% of the variance in EDSS scores, and we were surprised that we didn’t see this with MACFIMS,” Ms. Manglani said.
Key advantages
The NIHTB-CB was developed as part of the NIH Blueprint for Neuroscience Research initiative and commissioned by 16 NIH Institutes to provide brief, efficient assessment measures of cognitive function.
The battery has been validated in healthy individuals and tested in other populations with neurologic disorders, including patients who have suffered stroke and traumatic brain injury.
Ms. Manglani noted that the NIHTB-CB had key advantages over other tests. “First, it is a 30-minute iPad-based battery, which is shorter than most cognitive batteries available, and one of the few that is completely computerized. In addition, it automatically scores performance and yields a report with both composite scores and scores for each subtest,” she said.
In addition, said Ms. Manglani, “the NIH toolbox has a large validation sample of individuals between 8-85 years of age and provides normative scores that account for age, gender, education, and race/ethnicity, which allows individuals’ performances to be compared with their peers.”
The findings underscore that with further validation, the battery could have an important role in MS, she added.
“The NIH Toolbox needs to be tested in all subtypes of MS, with a full range of disease severity, and in MS clinics to gauge the clinical feasibility. Larger samples and repeated assessments are also needed to assess the test-retest reliability,” she said.
The study had no specific funding. The authors have disclosed no relevant financial relationships.
A version of this article first appeared on Medscape.com.
(RRMS), new research shows.
“To our knowledge this is the first psychometric evaluation of the NIH Toolbox Cognition Battery in MS,” said study investigator Heena R. Manglani, MA, a clinical psychology fellow at Massachusetts General Hospital and Harvard Medical School, Boston.
“[The findings] suggest that the NIH Toolbox Cognition Battery may be used as an alternative to other gold-standard measures which may cover limited domains or require manual scoring,” added Ms. Manglani, who is working toward her PhD in clinical psychology.
The study was presented at the 2021 Annual Meeting of the Consortium of Multiple Sclerosis Centers (CMSC).
An indicator of disease activity?
Cognitive deficits affecting a range of functions – including memory, attention and communication – are common in MS and affect 34% to 65% of patients with the disease, and the ability to detect and monitor such deficits has important implications.
Cognitive changes can provide a unique opportunity to identify acute disease activity in patients with MS that might be already occurring before physical manifestations become apparent, said Ms. Manglani. “If we can detect subtle changes in cognition that might foreshadow other symptoms of disease worsening, we can then allocate interventions that might stave off cognitive decline,” she explained.
While there is an array of well-established neuropsychological tests for the assessment of cognitive deficits, each has limitations, so a shorter, computerized, convenient, and reliable test could prove beneficial.
The NIHTB-CB has been validated in a large, nationally representative sample of individuals aged 8 to 85 and represents a potentially attractive option, yielding composite measures and scores corrected for age, gender, education, race, and ethnicity.
Comparative testing
To compare the test with other leading cognition tools used in MS, the investigators recruited 87 patients with RRMS (79% female, mean age 47.3 years). Participants were recruited to perform the full NIHTB-CB (about 30 minutes) and the full Minimal Assessment of Cognitive Function in Multiple Sclerosis (MACFIMS), which takes about 90 minutes, as well as some subsets from the Wechsler Adult Intelligence Scale-IV (WAIS-IV) covering processing speed and working memory. All patients had an EDSS of 5.0 or below and, on average, had been living with MS for about a decade.
The results showed the normative scores for NIHTB-CB had significant concordance with the other measures in terms of processing speed (concordance correlation coefficient [CCC] range = 0.28-0.48), working memory (CCC range = 0.27-0.37), and episodic memory (CCC range = 0.21-0.32). However, agreement was not shown for executive function (CCC range = 0.096-0.11).
Ms. Manglani noted executive function included various submeasures such as planning and inhibitory control. “Perhaps our gold standard measures tapped into a different facet of executive function than measured by the NIHTB,” she said.
The investigators found the proportion of participants classified as cognitively impaired was similar between the MACFIMS and the NIHTB tests.
Further assessment of fluid cognition on the NIHTB-CB – a composite of processing speed, working memory, episodic memory, and executive function that is automatically generated by the toolbox – showed the measure was negatively associated with disease severity, as measured by the EDSS (P = .006). However, the measure was not associated with a difference in depression (P = .39) or fatigue (P = .69).
Of note, a similar association with lower disease severity on the EDSS was not observed with MACFIMS.
“Interestingly, we found that only the NIHTB-CB fluid cognition was associated with disease severity, such that it was associated with nearly 11% of the variance in EDSS scores, and we were surprised that we didn’t see this with MACFIMS,” Ms. Manglani said.
Key advantages
The NIHTB-CB was developed as part of the NIH Blueprint for Neuroscience Research initiative and commissioned by 16 NIH Institutes to provide brief, efficient assessment measures of cognitive function.
The battery has been validated in healthy individuals and tested in other populations with neurologic disorders, including patients who have suffered stroke and traumatic brain injury.
Ms. Manglani noted that the NIHTB-CB had key advantages over other tests. “First, it is a 30-minute iPad-based battery, which is shorter than most cognitive batteries available, and one of the few that is completely computerized. In addition, it automatically scores performance and yields a report with both composite scores and scores for each subtest,” she said.
In addition, said Ms. Manglani, “the NIH toolbox has a large validation sample of individuals between 8-85 years of age and provides normative scores that account for age, gender, education, and race/ethnicity, which allows individuals’ performances to be compared with their peers.”
The findings underscore that with further validation, the battery could have an important role in MS, she added.
“The NIH Toolbox needs to be tested in all subtypes of MS, with a full range of disease severity, and in MS clinics to gauge the clinical feasibility. Larger samples and repeated assessments are also needed to assess the test-retest reliability,” she said.
The study had no specific funding. The authors have disclosed no relevant financial relationships.
A version of this article first appeared on Medscape.com.
The NIH Toolbox Cognition Battery (NIHTB-CB) shows good concordance with gold-standard cognitive measures in patients with relapsing-remitting multiple sclerosis (RRMS), new research shows.
“To our knowledge this is the first psychometric evaluation of the NIH Toolbox Cognition Battery in MS,” said study investigator Heena R. Manglani, MA, a clinical psychology fellow at Massachusetts General Hospital and Harvard Medical School, Boston.
“[The findings] suggest that the NIH Toolbox Cognition Battery may be used as an alternative to other gold-standard measures which may cover limited domains or require manual scoring,” added Ms. Manglani, who is working toward her PhD in clinical psychology.
The study was presented at the 2021 Annual Meeting of the Consortium of Multiple Sclerosis Centers (CMSC).
An indicator of disease activity?
Cognitive deficits affecting a range of functions – including memory, attention, and communication – are common in MS, affecting 34% to 65% of patients with the disease. The ability to detect and monitor such deficits has important implications.
Cognitive changes can provide a unique opportunity to identify acute disease activity in patients with MS that might be already occurring before physical manifestations become apparent, said Ms. Manglani. “If we can detect subtle changes in cognition that might foreshadow other symptoms of disease worsening, we can then allocate interventions that might stave off cognitive decline,” she explained.
While there is an array of well-established neuropsychological tests for the assessment of cognitive deficits, each has limitations, so a shorter, computerized, convenient, and reliable test could prove beneficial.
The NIHTB-CB has been validated in a large, nationally representative sample of individuals aged 8 to 85 and represents a potentially attractive option, yielding composite measures and scores corrected for age, gender, education, race, and ethnicity.
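As a rough illustration of how a demographically corrected score of this kind works (the Toolbox's actual norming models are more involved; the weights and values below are invented for illustration only):

```python
# Hedged sketch: converting a raw score to a T-score (mean 50, SD 10)
# relative to the expected score for a participant's demographic peers.
# The NIH Toolbox uses its own proprietary norming models; the numbers
# here are hypothetical.
def adjusted_t_score(raw, expected, sd):
    """T-score of `raw` against a peer group with mean `expected` and SD `sd`."""
    z = (raw - expected) / sd
    return 50 + 10 * z

# Hypothetical: participant scored 48; demographic peers average 52 (SD 8)
print(adjusted_t_score(48, 52, 8))  # 45.0, i.e., half an SD below peers
```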
Comparative testing
To compare the test with other leading cognition tools used in MS, the investigators recruited 87 patients with RRMS (79% female; mean age, 47.3 years). Participants completed the full NIHTB-CB (about 30 minutes) and the full Minimal Assessment of Cognitive Function in Multiple Sclerosis (MACFIMS), which takes about 90 minutes, as well as some subtests from the Wechsler Adult Intelligence Scale-IV (WAIS-IV) covering processing speed and working memory. All patients had an EDSS of 5.0 or below and, on average, had been living with MS for about a decade.
The results showed the normative scores for NIHTB-CB had significant concordance with the other measures in terms of processing speed (concordance correlation coefficient [CCC] range = 0.28-0.48), working memory (CCC range = 0.27-0.37), and episodic memory (CCC range = 0.21-0.32). However, agreement was not shown for executive function (CCC range = 0.096-0.11).
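For readers unfamiliar with the agreement statistic reported above, Lin's concordance correlation coefficient (CCC) can be sketched as follows. The scores are hypothetical, not study data:

```python
# Illustrative computation of Lin's concordance correlation coefficient,
# which measures agreement between two sets of paired scores.
from statistics import mean

def lins_ccc(x, y):
    mx, my = mean(x), mean(y)
    n = len(x)
    # population (biased) variances and covariance, per Lin's original formula
    vx = sum((xi - mx) ** 2 for xi in x) / n
    vy = sum((yi - my) ** 2 for yi in y) / n
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / n
    return 2 * cov / (vx + vy + (mx - my) ** 2)

# Hypothetical normative scores from two batteries for five participants
nihtb = [48, 52, 45, 60, 55]
macfims = [50, 51, 47, 58, 54]
print(round(lins_ccc(nihtb, macfims), 3))  # 0.933
```

Unlike a plain Pearson correlation, the CCC is penalized when one measure systematically runs higher or lower than the other, which makes it the stricter choice for comparing test batteries.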
Ms. Manglani noted executive function included various submeasures such as planning and inhibitory control. “Perhaps our gold standard measures tapped into a different facet of executive function than measured by the NIHTB,” she said.
The investigators found the proportion of participants classified as cognitively impaired was similar between the MACFIMS and the NIHTB tests.
Further assessment of fluid cognition on the NIHTB-CB – a composite of processing speed, working memory, episodic memory, and executive function that is automatically generated by the toolbox – showed the measure was negatively associated with disease severity, as measured by the EDSS (P = .006). However, the measure was not associated with a difference in depression (P = .39) or fatigue (P = .69).
Of note, a similar association with lower disease severity on the EDSS was not observed with MACFIMS.
“Interestingly, we found that only the NIHTB-CB fluid cognition was associated with disease severity, such that it was associated with nearly 11% of the variance in EDSS scores, and we were surprised that we didn’t see this with MACFIMS,” Ms. Manglani said.
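The "nearly 11% of the variance" figure is simply the square of the underlying correlation. Assuming a correlation of roughly -0.33 between fluid cognition and EDSS (the exact coefficient was not reported here):

```python
# Variance explained is the squared correlation coefficient:
# r ≈ -0.33 between fluid cognition and EDSS implies r² ≈ 0.11,
# i.e., about 11% of the variance in EDSS scores.
r = -0.33  # hypothetical value consistent with the reported ~11%
print(round(r ** 2, 2))  # 0.11
```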
Key advantages
The NIHTB-CB was developed as part of the NIH Blueprint for Neuroscience Research initiative and commissioned by 16 NIH Institutes to provide brief, efficient assessment measures of cognitive function.
The battery has been validated in healthy individuals and tested in other populations with neurologic disorders, including patients who have suffered stroke and traumatic brain injury.
Ms. Manglani noted that the NIHTB-CB had key advantages over other tests. “First, it is a 30-minute iPad-based battery, which is shorter than most cognitive batteries available, and one of the few that is completely computerized. In addition, it automatically scores performance and yields a report with both composite scores and scores for each subtest,” she said.
In addition, said Ms. Manglani, “the NIH toolbox has a large validation sample of individuals between 8-85 years of age and provides normative scores that account for age, gender, education, and race/ethnicity, which allows individuals’ performances to be compared with their peers.”
The findings underscore that with further validation, the battery could have an important role in MS, she added.
“The NIH Toolbox needs to be tested in all subtypes of MS, with a full range of disease severity, and in MS clinics to gauge the clinical feasibility. Larger samples and repeated assessments are also needed to assess the test-retest reliability,” she said.
The study had no specific funding. The authors have disclosed no relevant financial relationships.
A version of this article first appeared on Medscape.com.
FROM CMSC 2021
Specialty pharmacists may speed time to MS treatment
The involvement of specialty pharmacists may shorten the time to starting disease-modifying therapy (DMT) for patients with multiple sclerosis (MS), new data suggest.
“As DMT management and treatment options for MS symptoms become more complex, clinical pharmacists can be utilized for medication education and management,” Jenelle Hall Montgomery, PharmD, a clinical pharmacist practitioner at the Multiple Sclerosis and Neuroimmunology Division, department of neurology, Duke University Hospital, Durham, N.C., told delegates attending the 2021 Annual Meeting of the Consortium of Multiple Sclerosis Centers (CMSC).
Since 2018, more than half a dozen DMTs have been approved for MS by the U.S. Food and Drug Administration. However, there is currently no established DMT selection algorithm, and because of this, there is a need for specialty pharmacists, she added.
“DMT approvals by the FDA have outpaced MS guideline recommendations. This can be overwhelming for patients, especially now that they have so many options to choose from,” she said.
Key services provided by specialty pharmacists include coordinating pretreatment requirements, as well as help with dosing, side effects, safety monitoring, and treatment adherence. In addition, pharmacists help with switching therapies, dispensing, and cost and authorization problems.
In reporting on improvements associated with specialty pharmacists, researchers from prominent MS centers around the country described specific outcomes.
Aids early intervention
A report on the Kaiser Permanente Washington (KPWA) MS Pharmacy Program detailed significant reductions in the time to address patients’ needs through the use of specialty pharmacists. In an assessment of 391 referrals to the program from 2019 to 2020, the average total time spent per patient per year dropped from 145 minutes in 2019 to 109 minutes in 2020.
Services included assessment of medication adherence, adverse drug reaction consultation, lab monitoring, patient counseling on initiation of a DMT, shared decision making, and follow-up visits.
“The KPWA MS Pharmacy Program plays an integral role in the care of patients with MS. The MS clinical pharmacists ensure patients are well informed about their DMT options and are fully educated about selected treatment,” the investigators noted.
A report on an outpatient MS clinic at Emory Healthcare, Atlanta, described how use of specialty pharmacist services resulted in a 49% reduction in time to treatment initiation with fingolimod. The time decreased from 83.9 days to 42.9 days following the introduction of specialty pharmacist services.
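The 49% figure follows directly from the reported times:

```python
# Checking the reported reduction in time to fingolimod initiation
# after introduction of specialty pharmacist services at the Emory clinic.
before, after = 83.9, 42.9  # days, as reported
reduction_pct = (before - after) / before * 100
print(round(reduction_pct))  # 49
```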
“Integration of a clinical pharmacy specialist in the therapeutic management of MS patients is crucial to early intervention with disease-modifying therapy,” the investigators noted.
A report on the specialty pharmacy services provided at Johns Hopkins MS Precision Medicine Center of Excellence, Baltimore, described an evaluation of 708 assessments between July 2019 and June 2020. Results showed that the vast majority (98%) of patients reported no missed days from work or school due to MS-related symptoms and that 99.3% reported no hospitalizations due to MS relapses, which are both key measures of MS treatment adherence.
High patient satisfaction
Patients reported high satisfaction with the in-house pharmacy on the National Association of Specialty Pharmacy’s patient satisfaction survey. In the survey, the average score was 82, compared with 79 for external specialty pharmacies.
“Moreover, patients were highly satisfied with the services provided at the pharmacy and were likely to continue receiving their comprehensive pharmacy care at our institution,” the researchers reported.
The study “highlights the value of pharmacists’ involvement in patient care and supports the need for continuation of integrated clinical services in health system specialty pharmacy,” the investigators noted.
CMSC President Scott D. Newsome, DO, director of the Neurosciences Consultation and Infusion Center at Green Spring Station, Lutherville, Maryland, and associate professor of neurology at Johns Hopkins University School of Medicine, said that as a clinician, he is highly satisfied with the specialty pharmacy services for MS at Johns Hopkins.
“Our pharmacists are fantastic in communicating with the prescriber if something comes up related to medication safety or they are concerned that the patient isn’t adhering to the medication,” Dr. Newsome said.
He noted that in addition to helping to alleviate the burden of a myriad of tasks associated with prescribing for patients with MS, specialty pharmacists may have an important impact on outcomes, although more data are needed.
“Having a specialty pharmacy involved in the care of our patients can help navigate the challenges associated with the process of obtaining approval for DMTs,” he said. “We know how important it is to expedite and shorten the time frame from writing the prescription to getting the person on their DMT.”
Telemedicine, other models
Although integrated specialty pharmacist services may seem out of reach for smaller MS clinics, the use of telemedicine and other models may help achieve similar results.
“A model I have seen is having pharmacists split their time between a specialty pharmacy and the MS clinic,” said Dr. Montgomery.
“A telemedicine model can also be utilized, in which a pharmacist can reach out to patients by telephone or through video visits. This would allow a pharmacist to be utilized for multiple clinics or as an MS specialist within a specialty pharmacy,” she added.
Whether provided in house or through telemedicine, a key benefit for clinicians is in freeing up valuable time, which has a domino effect in improving quality all around.
“In addition to improving safety outcomes, specialty pharmacists help with the allocation of clinic staff to other clinic responsibilities, and the utilization of services by patients results in more resources allocated for their care,” Dr. Montgomery said.
Dr. Montgomery is a nonpromotional speaker for Novartis and is on its advisory board.
A version of this article first appeared on Medscape.com.
FROM CMSC 2021
Cannabis use common for MS-related spasticity
Cannabis use is common among patients with multiple sclerosis (MS), most often to manage spasticity, new research suggests. Findings from a survey conducted through a large registry in 2020 showed that 31% of patients with MS reported trying cannabis to treat their symptoms – and 20% reported regular use.
Spasticity was the reason for use cited by 80% of users, while 69% cited pain and 61% cited sleep problems/insomnia.
Investigators noted that the new data reflect the latest patterns of use amid sweeping changes in recreational and medical marijuana laws.
“Interest in the use of cannabis for managing MS symptoms continues to increase as more data become available and access becomes easier,” co-investigator Amber Salter, PhD, associate professor, UT Southwestern Medical Center, Dallas, told attendees at the 2021 Annual Meeting of the Consortium of Multiple Sclerosis Centers (CMSC).
Administration routes vary
The survey was conducted through the longitudinal North American Research Committee on Multiple Sclerosis (NARCOMS) Registry, a voluntary, self-report registry for patients with MS. Of 6,934 registry participants invited to participate, 3,249 (47%) responded. The majority of responders were women (79%) and the mean age was 61 years. About 63% were being treated with disease-modifying therapies.
Overall, 31% of respondents reported having used cannabis to treat their MS symptoms. In addition, 20% reported regular current cannabis use, with an average use of 20 days in the past month. As many as 40% of the current users reported using cannabis daily.
“In general we saw some small differences in current users, who tended to include more males; have higher spasticity, pain, and sleep symptoms; and [were] more likely to be unemployed and younger,” Dr. Salter said.
The most common forms of cannabis administration were smoking (33%) and eating (20%). In addition, 12% reported vaporizing cannabis with a highly concentrated material, 11% administered cannabis sublingually, and 11% reported swallowing it.
Further, 8% reported vaporizing cannabis as a dried flower, 5% used it topically, and 1% reported drinking it.
Of note, the definition of “cannabis/marijuana” in the study excluded hemp cannabidiol (CBD) or products marketed as CBD only.
Consistent use
The most common reason for use by far was spasticity (80%). This was followed by pain (69%) and sleep/insomnia problems (61%). Among users, 37% reported using cannabis to treat all three of those problems.
Regarding other symptoms, 36% used cannabis for anxiety, 24% for depression, 18% for overactive bladder, 17% for nausea or gastrointestinal problems, 16% for migraine or headaches, 14% for tremors, and 6% for other purposes.
The vast majority (95%) reported cannabis to be very or somewhat helpful for their symptoms.
Among the 69% of respondents who reported not using cannabis for their MS symptoms, the most commonly cited reasons were a lack of evidence on efficacy (40%) or safety (27%), concerns of legality (25%), lack of insurance coverage (22%), prohibitive cost (18%), and adverse side effects.
Surprisingly, the dramatic shift in the legalization of cannabis use in many states does not appear to be reflected in changes in cannabis use for MS, Dr. Salter said.
“We conducted an anonymous NARCOMS survey a couple of years prior to this survey, and our results are generally consistent. There’s been a small increase in the use and an acceptance or willingness to consider cannabis, but it’s relatively consistent,” she said.
“Despite the changes in access, the landscape hasn’t really changed very much in terms of evidence of the effects on MS symptoms, so that could be why,” Dr. Salter added.
Most patients appear to feel comfortable discussing their cannabis use with their physician, with 75% reporting doing so. However, the most common primary source of medical guidance for treating MS with cannabis was “nobody/self”; for 20%, the source for medical guidance was a dispensary professional.
As many as 62% of respondents reported obtaining their cannabis products from dispensaries, while other sources included family/friend (18%) or an acquaintance (13%). About 31% reported their most preferred type of cannabis to be equal parts THC and cannabidiol, while 30% preferred high THC/low cannabidiol.
Mirrors clinical practice findings
Commenting on the study, Laura T. Safar, MD, vice chair of Psychiatry at Lahey Hospital and Medical Center and assistant professor of psychiatry at Harvard Medical School, Boston, said the findings generally fall in line with cannabis use among patients with MS in her practice.
“This is [consistent] with my general experience: A high percentage of my patients with MS are using cannabis with the goal of addressing their MS symptoms that way,” said Dr. Safar, who was not involved with the research.
One notable recent change in patients’ inquiries about cannabis is their apparent confidence in the information they’re getting, she noted. This is a sign of the ever-expanding sources of information – but from sources who may or may not have an understanding of effects in MS, she added.
“What seems new is a certain level of specificity in the information patients state – regardless of its accuracy. There is more technical information widely available about cannabis online and in the dispensaries,” said Dr. Safar.
“A lot of that information may not have been tested scientifically, but it is presented with an aura of truth,” she said.
While misconceptions about cannabis use in MS may not be new, the conviction with which they are stated and believed seems stronger, despite having been validated only by sources of questionable expertise, Dr. Safar noted.
She pointed out that psychiatric effects are among her patients’ notable concerns of cannabis use in MS.
“Cannabis use, especially daily use in moderate to large amounts, can have negative cognitive side effects,” she said. “In addition, it can have other psychiatric side effects: worsening of mood and anxiety, apathy, and anhedonia, a lack of pleasure or enjoyment, and a flattening of the emotional experience.”
Countering misinformation
Dr. Safar said she works to counter misinformation and provide more reliable, evidence-based recommendations.
“I educate my patients about what we know from scientific trials about the potential benefits, including possible help with pain, excluding central pain, and with spasticity,” she said. Dr. Safar added that she also discusses possible risks, such as worsening of cognition, mood, and anxiety.
On the basis of an individual’s presentation, and working in collaboration with their neurologist as appropriate, Dr. Safar said she discusses the following issues with the patient:
- Does cannabis make sense for the symptoms being presented?
- Has the patient received benefit so far?
- Are there side effects they may be experiencing?
- Would it be appropriate to lower the cannabis dose/frequency of its use?
- If a patient is using cannabis with an objective that is not backed up by the literature, such as depression, are they open to information about other treatment options?
The study was sponsored by GW Research. Dr. Salter has conducted research for GW Pharmaceuticals companies. Dr. Safar has disclosed no relevant financial relationships.
A version of this article first appeared on Medscape.com.
FROM CMSC 2021
New consensus guideline on clinical MRI use in MS
The guideline represents a collaboration between the Consortium of Multiple Sclerosis Centers, the European-based Magnetic Resonance Imaging in Multiple Sclerosis, and North American Imaging in Multiple Sclerosis.
Among its recommendations for improving the diagnosis and management of MS are much-needed measures to boost protocol adherence. “The key part of these recommendations that we want to emphasize is how important it is for them to be used,” said David Li, MD, University of British Columbia, Vancouver, and cochair of the MRI guideline committee.
Dr. Li noted widespread noncompliance among MRI centers with the 2018 CMSC guidelines for MS imaging. This potentially compromised clinicians’ ability to identify lesions that allow for earlier, more confident diagnoses and to monitor for disease changes that may necessitate starting or changing therapy, he said.
“The key to being able to know that brain changes have occurred in patients over time is to have scans that have been performed using standardized protocols – to be certain that the change is truly the result of a change in disease activity and progression and not erroneously due to differences resulting from different MRI scanning procedures,” he said to attendees at the 2021 Annual Meeting of the Consortium of Multiple Sclerosis Centers (CMSC).
The guideline was also published this summer as a position paper in Lancet Neurology.
Key recommendations
The new guideline covers a broad range of imaging topics, with key areas of focus including the use of three-dimensional imaging, when and when not to use gadolinium contrast, and spinal cord imaging.
For example, a 3 Tesla (3 T) magnet is preferred for brain MRI because of its increased sensitivity for detecting lesions, but a magnet strength of at least 1.5 T is acceptable. For the spinal cord, the guideline notes, 3 T offers no advantage over 1.5 T.
Other recommendations include:
- Core sequences for the brain should include sagittal and axial T2-weighted 3D fluid-attenuated inversion recovery (FLAIR), along with axial T2-weighted and diffusion-weighted sequences.
- 3D acquisitions, which are now available on most scanners, are preferable to 2D acquisitions.
- Use of the subcallosal plane for consistent and reproducible alignment of axial scans is again emphasized, as it allows for easier and more confident comparison of follow-up studies to detect changes over time.
- At least two of three sagittal sequences are recommended for spinal cord MRI.
- The judicious use of macrocyclic gadolinium-based contrast agents (GBCA) is reemphasized because of their invaluable role in specific circumstances.
- However, for routine follow-up monitoring for subclinical disease activity, high-quality nonenhanced scans will allow for identification of new or enlarging T2 lesions without the need for GBCA.
- A new baseline brain MRI scan without gadolinium is recommended at least 3 months after treatment initiation, with annual follow-up scans without gadolinium.
For the diagnosis of MS, imaging of the entire spinal cord, as opposed to only the cervical segments, is recommended for the detection of lesions in the lower thoracic spinal segments and conus. However, 1.5-T scans are acceptable in that imaging, as 3-T scans provide no advantage. For routine follow-up monitoring, spinal cord MRI is optional.
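The adherence studies described below effectively audited scans against rules like the ones listed above. As a purely illustrative sketch (the field names and rule set here are hypothetical simplifications, not part of the published protocol), the core brain-MRI requirements could be encoded as checkable rules:

```python
# Toy compliance checker for a few 2021 MAGNIMS-CMSC-NAIMS brain-MRI
# recommendations. Field names and rules are illustrative only.

REQUIRED_BRAIN_SEQUENCES = {"3D FLAIR", "axial T2", "DWI"}

def check_brain_protocol(scan: dict) -> list:
    """Return a list of guideline deviations for one scan description."""
    issues = []
    # Guideline: 3 T preferred for brain, 1.5 T minimum acceptable.
    if scan.get("field_strength_T", 0) < 1.5:
        issues.append("magnet below the 1.5 T minimum (3 T preferred)")
    # Guideline: core sequences include 3D FLAIR, axial T2, and DWI.
    missing = REQUIRED_BRAIN_SEQUENCES - set(scan.get("sequences", []))
    if missing:
        issues.append("missing core sequences: %s" % sorted(missing))
    # Guideline: 3D acquisition preferred over 2D.
    if scan.get("acquisition") == "2D":
        issues.append("2D acquisition; 3D is preferred")
    return issues

scan = {"field_strength_T": 1.5, "acquisition": "2D",
        "sequences": ["axial T2", "DWI"]}
print(check_brain_protocol(scan))  # flags missing 3D FLAIR and 2D acquisition
```

A real audit, such as those reported at the meeting, would of course parse DICOM metadata and apply the full protocol rather than this three-rule sketch.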
“The current guidelines do not recommend routine follow-up spinal cord MRI, as it remains technically challenging and would disproportionately increase the scanning time; however, experienced centers have the option to do so, as a small number of asymptomatic spinal cord lesions do develop on follow-up,” the authors noted.
“However, follow-up spinal cord MRI is recommended in special circumstances, including unexpected disease worsening and the possibility of a diagnosis other than multiple sclerosis,” they added.
Although the central vein sign has gained significant interest as a potential biomarker of inflammatory demyelination to help distinguish between MS and non-MS lesions, the 2021 protocol does not currently recommend imaging for the feature. However, those recommendations may change in future guidelines, the authors noted.
Low protocol adherence
The ongoing lack of adherence to guidelines, which has resulted in frustrating inconsistencies in imaging, was documented in no fewer than four studies presented at the meeting. They showed compliance with standard protocols to be strikingly poor.
Among the studies was one presented by Anthony Traboulsee, MD, professor and MS Society of Canada research chair at the University of British Columbia, Vancouver. Findings showed that only about half of scans acquired in a real-world dataset satisfied 2018 CMSC Standardized Brain MRI recommendations.
“Of note was that all the scans that were compliant were acquired in 3D while none of the 2D-acquired sequences were adherent,” Dr. Li commented.
Another study assessed use of standardized MRI protocols in a pragmatic, multisite MS clinical trial, the Traditional vs. Early Aggressive Therapy in Multiple Sclerosis (TREAT-MS) trial. Results showed that, upon enrollment, only 10% of scans followed CMSC guidelines for all three structural contrasts.
In that study, when the images provided by Johns Hopkins University Medical School were excluded, that figure dropped to 2.75% of remaining scans that met the criteria.
“Despite the importance of standardization of high-quality MRIs for the monitoring of people with MS, adoption of recommended imaging remains low,” the investigators wrote.
Resistance to change?
Commenting on the research and new guideline, Blake E. Dewey, a PhD student in the department of electrical and computer engineering at Johns Hopkins University, Baltimore, speculated that the noncompliance is often simply a matter of resistance to change.
“There are a number of reasons that are given for the retention of older, noncompliant MRI scans at different institutions, such as timing and patient throughput; but in my mind the issue is institutional inertia,” he said.
“It is difficult in many instances to get the clinician [radiologist] and institutional buy-in to make these kinds of changes across the board,” Mr. Dewey noted.
“The most common protocol that we see acquired is a set of 2D, low-resolution images with gaps between slices. These are simply not sufficient given modern MRI technology and the needs of MS clinicians,” he added.
Importantly, Mr. Dewey noted that, through direct communication with imaging staff and practitioners in the trial, compliance increased substantially – nearly 20-fold, “indicating a real possibility for outreach, including to commonly used outpatient radiology facilities.”
The updated MAGNIMS-CMSC-NAIMS MRI protocol is beneficial in providing “simple, reasonable guidelines that can be easily acquired at almost any imaging location in the U.S., and much of the rest of the world,” he said.
“As imaging researchers, we often reach for more than is needed clinically to properly diagnose and monitor a patient’s disease,” Mr. Dewey added. “This updated protocol has ‘trimmed the fat’ and left some discretion to institutions, which should help with compliance.”
Mr. Dewey said he also encourages imaging professionals to consider performing the sequences described as “optional.”
“Some of these are useful in measuring potential biomarkers currently under extensive validation, such as brain volumetrics and the central vein sign, that may help patient populations that are currently underserved by more traditional imaging, such as progressive patients and patients that could be potentially misdiagnosed,” he said.
Spreading the word
In the meantime, as part of its own outreach efforts, the CMSC is providing laminated cards that present the 2021 updated MRI protocol in simplified tables, making the information easy for centers to access and helping raise patients’ awareness of the protocol.
“We are urging clinicians to provide the cards to their MS patients and have them present the cards to their imaging center,” Dr. Li said. “This effort could make such an important difference in helping to encourage more to follow the protocol.”
Clinicians and patients alike can download the MRI protocol card from the CMSC website.
A version of this article first appeared on Medscape.com.
The guideline represents a collaboration between the Consortium of Multiple Sclerosis Centers, the European-based Magnetic Resonance Imaging in Multiple Sclerosis, and North American Imaging in Multiple Sclerosis.
Among its recommendations for improving diagnosis and management of MS is the establishment of much-needed ways to boost protocol adherence. “The key part of these recommendations that we want to emphasize is how important it is for them to be used,” said David Li, MD, University of British Columbia, Vancouver, and cochair of the MRI guideline committee.
Dr. Li noted that there was a widespread lack of adherence among MRI centers to compliance with the 2018 CMSC guidelines in imaging for MS. This potentially compromised clinicians’ ability to identify lesions that allow for earlier and confident diagnoses and to monitor for disease changes that may necessitate the initiation or change of therapy, he said.
“The key to being able to know that brain changes have occurred in patients over time is to have scans that have been performed using standardized protocols – to be certain that the change is truly the result of a change in disease activity and progression and not erroneously due to differences resulting from different MRI scanning procedures,” he said to attendees at the 2021 Annual Meeting of the Consortium of Multiple Sclerosis Centers (CMSC).
The guideline was also published this summer as a position paper in Lancet Neurology.
Key recommendations
The new guideline covers a broad range of imaging topics, with key areas of focus including the use of three-dimensional imaging, when and when not to use gadolinium contrast, and spinal cord imaging.
For example, a 3 Tesla magnet strength is preferred when imaging the brain with MRI because of its increased sensitivity for detecting lesions – but a minimum magnet strength of at least 1.5 T can also be used. For the spinal cord, there is no advantage of 3 T over 1.5 T, the guideline notes.
Other recommendations include:
- Core sequences for the brain should include sagittal and axial T2-weighted 3D fluid-attenuated inversion recovery (FLAIR), along with axial T2-weighted and diffusion-weighted sequences.
- 3D acquisition, which is now available on most scanners, is preferable to 2D acquisitions.
- Use of the subcallosal plane for consistent and reproducible alignment of axial scans is again emphasized, as it allows for easier and more confident comparison of follow-up studies to detect changes over time.
- At least two of three sagittal sequences are recommended for spinal cord MRI.
- The judicious use of macrocyclic gadolinium-based contrast agents (GBCA) is reemphasized because of its invaluable role in specific circumstances.
- However, for routine follow-up monitoring for subclinical disease activity, high-quality nonenhanced scans will allow for identification of new or enlarging T2 lesions without the need for GBCA.
- A new baseline brain MRI scan without gadolinium is recommended at least 3 months after treatment initiation, with annual follow-up scans without gadolinium.
For the diagnosis of MS, imaging of the entire spinal cord, as opposed to only the cervical segments, is recommended for the detection of lesions in the lower thoracic spinal segments and conus. However, 1.5-T scans are acceptable in that imaging, as 3-T scans provide no advantage. For routine follow-up monitoring, spinal cord MRI is optional.
“The current guidelines do not recommend routine follow-up spinal cord MRI, as it remains technically challenging and would disproportionately increase the scanning time, however experienced centers have the option to do so as a small number of asymptomatic spinal cord lesions do develop on follow-up,” the authors noted.
“However, follow up spinal cord MRI is recommended in special circumstances, including unexpected disease worsening and the possibility of a diagnosis other than multiple sclerosis,” they added.
Although the central vein sign has gained significant interest as a potential biomarker of inflammatory demyelination to help distinguish between MS and non-MS lesions, the 2021 protocol does not currently recommend imaging for the feature. However, those recommendations may change in future guidelines, the authors noted.
Low protocol adherence
The ongoing lack of adherence to guidelines that has resulted in frustrating inconsistencies in imaging was documented in no less than four studies presented at the meeting. They showed compliance with standard protocols to be strikingly poor.
Among the studies was one presented by Anthony Traboulsee, MD, professor and research chair of the MS Society of Canada, and from the University of British Columbia in Vancouver. Findings showed that only about half of scans acquired in a real-world dataset satisfied 2018 CMSC Standardized Brain MRI recommendations.
“Of note was that all the scans that were compliant were acquired in 3D while none of the 2D-acquired sequences were adherent,” Dr. Li commented.
Another study assessed use of standardized MRI protocols in a pragmatic, multisite MS clinical trial, the Traditional vs. Early Aggressive Therapy in Multiple Sclerosis (TREAT-MS) trial. Results showed that, upon enrollment, only 10% of scans followed CMSC guidelines for all three structural contrasts.
In that study, when the images provided by Johns Hopkins University Medical School were excluded, that figure dropped to 2.75% of remaining scans that met the criteria.
“Despite the importance of standardization of high-quality MRIs for the monitoring of people with MS, adoption of recommended imaging remains low,” the investigators wrote.
Resistance to change?
Commenting on the research and new guideline, Blake E. Dewey, PhD student, department of electrical and computer engineering at Johns Hopkins University, Baltimore, speculated that the noncompliance is often simply a matter of resistance to change.
“There are a number of reasons that are given for the retention of older, noncompliant MRI scans at different institutions, such as timing and patient throughput; but in my mind the issue is institutional inertia,” he said.
“It is difficult in many instances to get the clinician [radiologist] and institutional buy-in to make these kinds of changes across the board,” Mr. Dewey noted.
“The most common protocol that we see acquired is a set of 2D, low-resolution images with gaps between slices. These are simply not sufficient given modern MRI technology and the needs of MS clinicians,” he added.
Importantly, Mr. Dewey noted that, through direct communication with imaging staff and practitioners in the trial, compliance increased substantially – nearly 20-fold, “indicating a real possibility for outreach, including to commonly used outpatient radiology facilities.”
The updated MAGNIMS-CMSC-NAIMS MRI protocol is beneficial in providing “simple, reasonable guidelines that can be easily acquired at almost any imaging location in the U.S., and much of the rest of the world,” he said.
“As imaging researchers, we often reach for more that is needed clinically to properly diagnose and monitor a patient’s disease,” Mr. Dewey added. “This updated protocol has ‘trimmed the fat’ and left some discretion to institutions, which should help with compliance.”
Mr. Dewey said he also encourages imaging professionals to consider performing the sequences described as “optional” as well.
“Some of these are useful in measuring potential biomarkers currently under extensive validation, such as brain volumetrics and the central vein sign, that may help patient populations that are currently underserved by more traditional imaging, such as progressive patients and patients that could be potentially misdiagnosed,” he said.
Spreading the word
In the meantime, as part of its own outreach efforts, the CMSC is providing laminated cards that detail in simplified tables the 2021 updated MRI protocol. This makes it easy for centers to access the information and patients to help improve awareness of the protocol.
“We are urging clinicians to provide the cards to their MS patients and have them present the cards to their imaging center,” Dr. Li said. “This effort could make such an important difference in helping to encourage more to follow the protocol.”
Clinicians and patients alike can download the MRI protocol card from the CMSC website.
A version of this article first appeared on Medscape.com.
The guideline represents a collaboration between the Consortium of Multiple Sclerosis Centers, the European-based Magnetic Resonance Imaging in Multiple Sclerosis, and North American Imaging in Multiple Sclerosis.
Among its recommendations for improving diagnosis and management of MS is the establishment of much-needed ways to boost protocol adherence. “The key part of these recommendations that we want to emphasize is how important it is for them to be used,” said David Li, MD, University of British Columbia, Vancouver, and cochair of the MRI guideline committee.
Dr. Li noted that there was a widespread lack of adherence among MRI centers to compliance with the 2018 CMSC guidelines in imaging for MS. This potentially compromised clinicians’ ability to identify lesions that allow for earlier and confident diagnoses and to monitor for disease changes that may necessitate the initiation or change of therapy, he said.
“The key to being able to know that brain changes have occurred in patients over time is to have scans that have been performed using standardized protocols – to be certain that the change is truly the result of a change in disease activity and progression and not erroneously due to differences resulting from different MRI scanning procedures,” he said to attendees at the 2021 Annual Meeting of the Consortium of Multiple Sclerosis Centers (CMSC).
The guideline was also published this summer as a position paper in Lancet Neurology.
Key recommendations
The new guideline covers a broad range of imaging topics, with key areas of focus including the use of three-dimensional imaging, when and when not to use gadolinium contrast, and spinal cord imaging.
For example, a 3 Tesla magnet strength is preferred when imaging the brain with MRI because of its increased sensitivity for detecting lesions – but a minimum magnet strength of at least 1.5 T can also be used. For the spinal cord, there is no advantage of 3 T over 1.5 T, the guideline notes.
Other recommendations include:
- Core sequences for the brain should include sagittal and axial T2-weighted 3D fluid-attenuated inversion recovery (FLAIR), along with axial T2-weighted and diffusion-weighted sequences.
- 3D acquisition, which is now available on most scanners, is preferable to 2D acquisitions.
- Use of the subcallosal plane for consistent and reproducible alignment of axial scans is again emphasized, as it allows for easier and more confident comparison of follow-up studies to detect changes over time.
- At least two of three sagittal sequences are recommended for spinal cord MRI.
- The judicious use of macrocyclic gadolinium-based contrast agents (GBCA) is reemphasized because of their invaluable role in specific circumstances.
- However, for routine follow-up monitoring for subclinical disease activity, high-quality nonenhanced scans will allow for identification of new or enlarging T2 lesions without the need for GBCA.
- A new baseline brain MRI scan without gadolinium is recommended at least 3 months after treatment initiation, with annual follow-up scans without gadolinium.
For the diagnosis of MS, imaging of the entire spinal cord, as opposed to only the cervical segments, is recommended for the detection of lesions in the lower thoracic spinal segments and conus. However, 1.5-T scans are acceptable in that imaging, as 3-T scans provide no advantage. For routine follow-up monitoring, spinal cord MRI is optional.
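As a rough illustration, the core brain-imaging recommendations above can be sketched as a simple protocol checklist. This is not an official machine-readable form of the guideline; the sequence names and function are hypothetical and purely illustrative.

```python
# Hedged sketch: flag deviations from the core 2021 brain protocol
# described above. Names and structure are illustrative only.

CORE_BRAIN_SEQUENCES = {
    "sagittal_3d_t2_flair", "axial_3d_t2_flair",  # 3D FLAIR, both planes
    "axial_t2", "axial_dwi",                      # axial T2 and diffusion
}

def check_brain_protocol(acquired, field_strength_t, is_3d):
    """Return a list of deviations from the sketched core protocol."""
    issues = []
    missing = CORE_BRAIN_SEQUENCES - set(acquired)
    if missing:
        issues.append(f"missing sequences: {sorted(missing)}")
    if field_strength_t < 1.5:
        issues.append("field strength below the 1.5 T minimum")
    if not is_3d:
        issues.append("2D acquisition; 3D is preferred")
    return issues

print(check_brain_protocol(CORE_BRAIN_SEQUENCES, 3.0, True))  # []
```

A compliant 3 T, 3D acquisition with all core sequences returns no issues; the low-resolution 2D protocols described later in this article would fail on several counts.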
“The current guidelines do not recommend routine follow-up spinal cord MRI, as it remains technically challenging and would disproportionately increase the scanning time; however, experienced centers have the option to do so, as a small number of asymptomatic spinal cord lesions do develop on follow-up,” the authors noted.
“However, follow up spinal cord MRI is recommended in special circumstances, including unexpected disease worsening and the possibility of a diagnosis other than multiple sclerosis,” they added.
Although the central vein sign has gained significant interest as a potential biomarker of inflammatory demyelination to help distinguish between MS and non-MS lesions, the 2021 protocol does not currently recommend imaging for the feature. However, those recommendations may change in future guidelines, the authors noted.
Low protocol adherence
The ongoing lack of adherence to guidelines that has resulted in frustrating inconsistencies in imaging was documented in no fewer than four studies presented at the meeting. They showed compliance with standard protocols to be strikingly poor.
Among the studies was one presented by Anthony Traboulsee, MD, professor and MS Society of Canada research chair at the University of British Columbia in Vancouver. Findings showed that only about half of scans acquired in a real-world dataset satisfied 2018 CMSC Standardized Brain MRI recommendations.
“Of note was that all the scans that were compliant were acquired in 3D while none of the 2D-acquired sequences were adherent,” Dr. Li commented.
Another study assessed use of standardized MRI protocols in a pragmatic, multisite MS clinical trial, the Traditional vs. Early Aggressive Therapy in Multiple Sclerosis (TREAT-MS) trial. Results showed that, upon enrollment, only 10% of scans followed CMSC guidelines for all three structural contrasts.
In that study, when the images provided by Johns Hopkins University Medical School were excluded, that figure dropped to 2.75% of remaining scans that met the criteria.
“Despite the importance of standardization of high-quality MRIs for the monitoring of people with MS, adoption of recommended imaging remains low,” the investigators wrote.
Resistance to change?
Commenting on the research and new guideline, Blake E. Dewey, PhD student, department of electrical and computer engineering at Johns Hopkins University, Baltimore, speculated that the noncompliance is often simply a matter of resistance to change.
“There are a number of reasons that are given for the retention of older, noncompliant MRI scans at different institutions, such as timing and patient throughput; but in my mind the issue is institutional inertia,” he said.
“It is difficult in many instances to get the clinician [radiologist] and institutional buy-in to make these kinds of changes across the board,” Mr. Dewey noted.
“The most common protocol that we see acquired is a set of 2D, low-resolution images with gaps between slices. These are simply not sufficient given modern MRI technology and the needs of MS clinicians,” he added.
Importantly, Mr. Dewey noted that, through direct communication with imaging staff and practitioners in the trial, compliance increased substantially – nearly 20-fold, “indicating a real possibility for outreach, including to commonly used outpatient radiology facilities.”
The updated MAGNIMS-CMSC-NAIMS MRI protocol is beneficial in providing “simple, reasonable guidelines that can be easily acquired at almost any imaging location in the U.S., and much of the rest of the world,” he said.
“As imaging researchers, we often reach for more than is needed clinically to properly diagnose and monitor a patient’s disease,” Mr. Dewey added. “This updated protocol has ‘trimmed the fat’ and left some discretion to institutions, which should help with compliance.”
Mr. Dewey said he also encourages imaging professionals to consider performing the sequences described as “optional” as well.
“Some of these are useful in measuring potential biomarkers currently under extensive validation, such as brain volumetrics and the central vein sign, that may help patient populations that are currently underserved by more traditional imaging, such as progressive patients and patients that could be potentially misdiagnosed,” he said.
Spreading the word
In the meantime, as part of its own outreach efforts, the CMSC is providing laminated cards that detail the 2021 updated MRI protocol in simplified tables, making it easy for centers to access the information and helping to improve patients’ awareness of the protocol.
“We are urging clinicians to provide the cards to their MS patients and have them present the cards to their imaging center,” Dr. Li said. “This effort could make such an important difference in helping to encourage more to follow the protocol.”
Clinicians and patients alike can download the MRI protocol card from the CMSC website.
A version of this article first appeared on Medscape.com.
FROM CMSC 2021
Two diets linked to improved cognition, fatigue in MS
A Paleolithic elimination diet (Wahls diet) and a low-saturated-fat diet (Swank diet) are each associated with improved cognition, among other clinical outcomes, in relapsing-remitting multiple sclerosis (RRMS), new research suggests.
In a randomized study of patients with RRMS, the group that followed a Wahls diet and the group that followed a Swank diet both showed significant, unique improvement in measures of cognitive dysfunction, fatigue, and quality of life.
“Several dietary intervention studies have demonstrated favorable results on MS-related fatigue and quality of life. However, these results are among the first to show favorable reductions in cognitive dysfunction,” said co-investigator Tyler Titcomb, PhD, department of internal medicine, University of Iowa, Iowa City.
The results were presented at the 2021 Annual Meeting of the Consortium of Multiple Sclerosis Centers (CMSC).
Similar diets
The CMSC findings came from a secondary analysis of a randomized trial published online in July in the Multiple Sclerosis Journal Experimental, Translational, and Clinical (MSJ-ETC).
The primary analysis of the single-blind, parallel group, randomized trial showed the Wahls and Swank diets were linked to significant improvement in outcomes on the Fatigue Severity Scale (FSS), the Modified Fatigue Impact Scale (MFIS), and other measures among participants with RRMS. There were no significant differences between the two dietary regimens.
The Swank diet restricts saturated fat to a maximum of 15 g per day while providing 20 g to 50 g (4 to 10 teaspoons) of unsaturated fat per day, with four servings each of whole grains, fruits, and vegetables.
The Wahls diet recommends six to nine servings of fruits and vegetables per day, in addition to 6 to 12 ounces of meat per day, according to gender. Grains, legumes, eggs, and dairy, with the exception of clarified butter or ghee, are not permitted on this diet. Both diets eschew processed foods.
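The two sets of rules above can be sketched as a simple daily compliance check. This is a minimal illustration only; the field names and data structure are hypothetical and are not taken from the study protocol.

```python
# Hedged sketch: classify one day's intake against the Swank and Wahls
# targets described above. All field names are hypothetical.

def meets_swank(day):
    """Swank: saturated fat <= 15 g/day, unsaturated fat 20-50 g/day,
    four servings each of whole grains, fruits, and vegetables."""
    return (
        day["saturated_fat_g"] <= 15
        and 20 <= day["unsaturated_fat_g"] <= 50
        and day["whole_grain_servings"] >= 4
        and day["fruit_servings"] >= 4
        and day["vegetable_servings"] >= 4
    )

def meets_wahls(day):
    """Wahls: 6-9 servings of fruits/vegetables, 6-12 oz of meat, and
    none of the excluded groups (grains, legumes, eggs, non-ghee dairy)."""
    excluded = {"grains", "legumes", "eggs", "dairy"}
    return (
        6 <= day["fruit_veg_servings"] <= 9
        and 6 <= day["meat_oz"] <= 12
        and not (excluded & set(day["food_groups"]))
    )

example = {
    "saturated_fat_g": 12, "unsaturated_fat_g": 30,
    "whole_grain_servings": 4, "fruit_servings": 4, "vegetable_servings": 5,
}
print(meets_swank(example))  # True
```

Note that the checks are deliberately independent: a day can satisfy one diet and not the other, which mirrors how the two regimens differ mainly in what they restrict rather than in overall intent.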
To further evaluate the diets’ effects on perceived fatigue and cognitive dysfunction, the researchers returned to the trial, which enrolled 95 adults with stable RRMS at the University of Iowa Prevention Intervention Center between August 2016 and May 2019.
After a 12-week run-in period with support and education from registered dietitians, participants were randomly assigned to either the Swank or Wahls diets in a 24-week intervention that did not include dietitian support.
Inclusion criteria included having moderate to severe fatigue, as shown by an FSS score of at least 4.0, while not having severe mental impairment, an eating disorder, or liver or kidney disease. There were no significant differences in baseline demographic or clinical characteristics between the groups.
Of the patients, 77 completed the 12-week run-in (38 in the Swank diet group and 39 in the Wahls group). A total of 72 participants completed the 24-week follow-up (37 and 35, respectively).
Reduction in fatigue, cognitive dysfunction
After the researchers controlled for smoking, alcohol consumption, age, sex, baseline distance 6-minute walk test, body mass index, serum vitamin D, and years with MS, results at 12 and 24 weeks showed significant improvements from baseline in the key outcomes of fatigue and cognitive function, as measured by the Fatigue Scale for Motor and Cognitive Function (FSMC).
Scores were −5.7 and −9.0, respectively, for the Swank diet group and −9.3 and −14.9 for the Wahls group (P ≤ .001 for all comparisons).
In addition, there was a significant reduction in both groups on the total Perceived Deficits Questionnaire (PDQ) at 12 and 24 weeks (Swank, −7.4 and −6.3, respectively; Wahls, −6.8 and −10.8; P ≤ .001 for all).
There were similar improvements for both diets in an analysis of the mental and physical scores on FSMC and on the subscales on PDQ of attention, retrospective memory, prospective memory, and planning.
As observed in the primary analysis, there were no significant differences between the two groups in absolute mean scores on FSMC, PDQ, or their subscales at any timepoint.
“Both diets led to significant reductions in fatigue and cognitive dysfunction,” Dr. Titcomb said.
Of note, the primary analysis further showed a statistically and clinically significant 6% increase in the 6-minute walk test at 24 weeks in the Wahls group (P = .007). After removal of nonadherent participants, the improvement remained significant at 24 weeks in the Wahls group (P = .02), as well as in the Swank group (P = .001).
Dr. Titcomb noted that the majority of study participants were taking disease-modifying therapies (DMTs). However, there were no interactions between any specific DMTs and dietary benefits.
Potential mechanisms
Although the similar outcomes of the two diets point to a common mechanism, there are various other possibilities, said Dr. Titcomb. These include modulation of the gut microbiome, inflammation, or the immune system, as well as micronutrient optimization, he said.
Previous research has shown reduced mass and diversity in the gut microbiota among patients with MS compared with those without MS, potentially promoting inflammation. Other research has shown improvements in those factors with dietary modification.
While there is no evidence of gut microbiota changes with the Wahls and Swank diets, each is rich in fiber and plant-derived phytochemicals, which are known to be associated with improvements in gut microbiota and neuroinflammation, the investigators noted.
Dr. Titcomb reported that research into the diets is continuing as they evaluate longer-term and other effects. “This trial was a short-term parallel arm trial that did not include MRI or a control group,” he said, adding that the investigators will soon start recruiting for a follow-up study that will include a control group, long-term follow-up, and MRIs.
That upcoming study “has the potential to answer several of the unknown questions regarding the effect of diet on MS,” Dr. Titcomb said.
Notable research with limitations
Commenting on the study, Rebecca Spain, MD, MSPH, associate professor of neurology at the Oregon Health & Science University, and associate director of clinical care at VA MS Center of Excellence West in Portland, said there were several notable findings.
This includes that most people with MS “were able to adhere to the protocols for significant lengths of time, even without the support of dietitians for the final 12 weeks of the study,” said Spain, who was not involved with the research.
A significant limitation was the lack of a control group. Without that, “it’s hard to know for sure if the improvements in fatigue and cognition were from the diets or were simply from the social support of participating in a research study,” she said.
Nevertheless, trials reporting on dietary effects in MS such as the current study are important, Dr. Spain noted. They demonstrate “that it is feasible and safe to conduct dietary studies and suggest which key MS symptoms may benefit and should be evaluated in future studies.”
“Critically, diet studies address one of the most frequent concerns of people with MS, promoting self-management and empowerment,” Dr. Spain concluded.
General guidelines for common dietary elements with evidence of improving fatigue, cognition, and mood are available on the National MS Society’s website.
The study received no outside funding. Dr. Titcomb and Dr. Spain have disclosed no relevant financial relationships. Terry L. Wahls, MD, who developed the Wahls diet, was a senior author of the study.
A version of this article first appeared on Medscape.com.
FROM CMSC 2021
Free vitamin D no better at predicting death in men than standard testing
In the clinical assessment of vitamin D concentrations, free 25-hydroxyvitamin D shows little added benefit over the current standard of total 25(OH)D, with deficiencies in each associated with at least a twofold risk of all-cause mortality, new research shows.
“In this prospective, population-based study of middle-aged and older European men, total 25(OH)D levels below 20 mcg/L were independently associated with a twofold increased all-cause mortality,” the researchers reported.
“Lower concentrations of free 25(OH)D were also predictive of mortality, but did not provide any additional information,” they noted. “The data do not support routine measurement of free 25(OH)D or 1,25(OH)2D [1,25-dihydroxyvitamin D] over total 25(OH)D levels.”
Although vitamin D deficiency is well established as playing a role in a wide range of adverse health effects, including cardiovascular disease and mortality, there has been a lack of consensus on the optimal concentration of total 25(OH)D, with studies using inconsistent levels to define insufficiency and deficiency.
One aspect of the debate has focused on precisely how to measure the concentrations, with some evidence supporting the “free hormone hypothesis,” which suggests that free 25(OH)D could represent a better indicator than the standard total 25(OH)D of functional availability of vitamin D, and have stronger clinical utility.
To investigate both issues, Marian Dejaeger, MD, PhD, and colleagues evaluated prospective data on 1,915 men recruited from eight centers around Europe in the European Male Aging Study, in a report published in the Journal of Clinical Endocrinology & Metabolism.
The men, who were aged between 40 and 79 years, had a mean follow-up of 12.3 years; during that time, about a quarter (23.5%) of them died.
In addition to other factors, including being older, having a higher body mass index, and having at least two comorbidities, men who died had significantly lower levels of total 25(OH)D, total 1,25(OH)2D, free 25(OH)D, and free 1,25(OH)2D, as well as higher parathyroid hormone and creatinine values.
After adjustment for key confounders, including body mass index, smoking, alcohol consumption, kidney function, number of comorbidities at baseline and other factors, men with a total 25(OH)D below 20 mcg/L had a significantly increased risk of mortality, compared with those who had normal levels of vitamin D, defined as above 30 mcg/L (hazard ratio, 2.03; P < .001).
In terms of free 25(OH)D, the lowest three free 25(OH)D quintiles (under 4.43 ng/L) similarly had a significantly higher mortality risk, compared with the highest quintile (HR, 2.09; P < .01) after adjustment for the confounders.
Further observations of all quintiles of other measures of 1,25(OH)2D and vitamin D binding protein (DBP) showed no associations with mortality after adjusting for confounders.
Methods of measurement
An important caveat of the study is the type of method used to measure free 25(OH)D. The authors calculated free 25(OH)D using a formula, as opposed to the alternative of direct measurement with an enzyme-linked immunosorbent assay kit, and there can be important differences between the two approaches, said Daniel Bikle, MD, PhD, a professor of medicine and dermatology at the San Francisco Veterans Affairs Medical Center and University of California, San Francisco, in a comment on the research.
“The biggest problem is that calculating free 25(OH)D does not give an accurate estimate of the real free level, so making conclusions regarding its role in clinical situations is subject to error,” said Dr. Bikle, who recently authored a review of the free hormone hypothesis.
A calculation approach “depends heavily on the total 25(OH)D level, so in a population with reasonably normal DBP and albumin levels, the correlation with total 25(OH)D is very high, so I am not surprised by the results showing no additional value,” he said in an interview.
The authors addressed their use of the calculation over the direct measurement in the study, noting that there is a “high correlation between both methods.”
But they added that, “as no equilibrium analysis method is available for free 25(OH)D, nor for free 1,25(OH)2D, no method can be considered superior.”
Dr. Dejaeger, of the department of public health and primary care, Katholieke Universiteit Leuven (Belgium), added that she agreed that high or low DBP could potentially shift some correlations, but noted that other research has shown calculated and direct measures to match relatively well.
“So we partly agree [with Dr. Bikle] not being surprised that we did not find an added value because we also found little variation in DBP, but we are not convinced that a different measurement method could make the difference here.”
Another caveat of the study is that, despite half of the measurements being taken in the summer, more than 90% of subjects in the study’s cohort had vitamin D insufficiency, defined in the study as total 25(OH)D levels below 30 mcg/L, and as many as 70% had deficiency, with levels below 20 mcg/L.
Therefore, “as the number of participants with high levels of total 25(OH)D in our study is small, a true threshold concentration for optimal vitamin D status cannot be defined on basis of our data,” the authors noted.
Under current recommendations, the Endocrine Society indicates that concentrations below 30 mcg/L are insufficient, while other groups, including the Institute of Medicine, suggest concentrations of 20 mcg/L or above are adequate.
Free hormone hypothesis
Under the free hormone hypothesis, which is observed with thyroid hormones and sex steroids, the very small fraction of free hormones that are not bound to protein carriers can enter cells and help facilitate biologic activity.
The hypothesis of a role of free 25(OH)D in mortality was supported by a recent study, in which free 25(OH)D levels – but not total 25(OH)D levels, were found to be independently associated with an increased risk of all-cause and cardiovascular mortality among patients with coronary artery disease.
However, two other studies are more consistent with the new findings, including one study showing no added value of free 25(OH)D as a marker for bone mineral density in older women, and another study showing no value as a marker of metabolic variables in healthy children.
“Currently, there are no hard data to support routine measurements of free 25(OH)D or 1,25(OH)2D over total 25(OH)D, the current standard of assessing vitamin D status, as stated in guidelines from different scientific bodies,” Dr. Dejaeger said in an interview.
The study received support from Versus Arthritis and the National Institute for Health Research Manchester Biomedical Research Centre. Dr. Dejaeger and Dr. Bikle had no disclosures to report.
In the clinical assessment of vitamin D status, free 25-hydroxyvitamin D offers little benefit beyond the current standard of total 25(OH)D, with deficiency in either measure associated with at least a twofold increased risk of all-cause mortality, new research shows.
“In this prospective, population-based study of middle-aged and older European men, total 25(OH)D levels below 20 mcg/L were independently associated with a twofold increased all-cause mortality,” the researchers reported.
“Lower concentrations of free 25(OH)D were also predictive of mortality, but did not provide any additional information,” they noted. “The data do not support routine measurement of free 25(OH)D or 1,25(OH)2D [1,25-dihydroxyvitamin D] over total 25(OH)D levels.”
Despite vitamin D deficiency being well established as playing a role in a wide range of adverse health effects, including cardiovascular disease and mortality, there has been a lack of consensus on the optimal concentration of total 25(OH)D, with studies showing inconsistent levels to define insufficiency and deficiency.
One aspect of the debate has focused on precisely how to measure the concentrations, with some evidence supporting the “free hormone hypothesis,” which suggests that free 25(OH)D could represent a better indicator than the standard total 25(OH)D of functional availability of vitamin D, and have stronger clinical utility.
To investigate both issues, Marian Dejaeger, MD, PhD, and colleagues evaluated prospective data on 1,915 men recruited from eight centers across Europe in the European Male Aging Study, in a report published in the Journal of Clinical Endocrinology & Metabolism.
The men, who were aged between 40 and 79 years, had a mean follow-up of 12.3 years; during that time, about a quarter (23.5%) of them died.
In addition to other factors, including being older, having a higher body mass index, and having at least two comorbidities, men who died had significantly lower levels of total 25(OH)D, total 1,25(OH)2D, free 25(OH)D, and free 1,25(OH)2D, as well as higher parathyroid hormone and creatinine values.
After adjustment for key confounders, including body mass index, smoking, alcohol consumption, kidney function, number of comorbidities at baseline, and other factors, men with a total 25(OH)D below 20 mcg/L had a significantly increased risk of mortality, compared with those who had normal levels of vitamin D, defined as above 30 mcg/L (hazard ratio, 2.03; P < .001).
In terms of free 25(OH)D, the lowest three free 25(OH)D quintiles (under 4.43 ng/L) similarly had a significantly higher mortality risk, compared with the highest quintile (HR, 2.09; P < .01) after adjustment for the confounders.
Further observations of all quintiles of other measures of 1,25(OH)2D and vitamin D binding protein (DBP) showed no associations with mortality after adjusting for confounders.
Methods of measurement
An important caveat of the study is the type of method used to measure free 25(OH)D. The authors calculated free 25(OH)D using a formula, as opposed to the alternative of direct measurement with an enzyme-linked immunosorbent assay kit, and there can be important differences between the two approaches, said Daniel Bikle, MD, PhD, a professor of medicine and dermatology at the San Francisco Veterans Affairs Medical Center and University of California, San Francisco, in a comment on the research.
“The biggest problem is that calculating free 25(OH)D does not give an accurate estimate of the real free level, so making conclusions regarding its role in clinical situations is subject to error,” said Dr. Bikle, who recently authored a review of the free hormone hypothesis.
A calculation approach “depends heavily on the total 25(OH)D level, so in a population with reasonably normal DBP and albumin levels, the correlation with total 25(OH)D is very high, so I am not surprised by the results showing no additional value,” he said in an interview.
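To make the distinction concrete, the calculated approach Dr. Bikle describes can be sketched as below. This is a hypothetical illustration, not the study's actual computation: the binding-affinity constants and the reference albumin/DBP concentrations are commonly cited literature values used here as assumptions, and the function name is invented.

```python
# Illustrative sketch of calculating free 25(OH)D from total 25(OH)D,
# albumin, and vitamin D binding protein (DBP). Constants are commonly
# cited literature values, used here as assumptions, not study data.
K_ALB = 6e5   # albumin-25(OH)D affinity constant, L/mol (approximate)
K_DBP = 7e8   # DBP-25(OH)D affinity constant, L/mol (approximate)

ALBUMIN_MW = 66_500  # g/mol
DBP_MW = 58_000      # g/mol

def calc_free_25ohd(total_ug_per_l, albumin_g_per_l, dbp_mg_per_l):
    """Return an estimated free 25(OH)D in ng/L (hypothetical helper)."""
    alb_mol = albumin_g_per_l / ALBUMIN_MW      # mol/L
    dbp_mol = (dbp_mg_per_l / 1000) / DBP_MW    # mol/L
    # Free fraction: the tiny share not bound to DBP or albumin.
    free_fraction = 1 / (1 + K_ALB * alb_mol + K_DBP * dbp_mol)
    return total_ug_per_l * 1000 * free_fraction  # ng/L

# With typical albumin (43 g/L) and DBP (258 mg/L), a total 25(OH)D of
# 20 mcg/L yields a free level of only a few ng/L - the same order of
# magnitude as the 4.43 ng/L quintile cutoff reported in the study.
```

Because the denominator is dominated by the DBP term, the calculated free level tracks total 25(OH)D almost linearly whenever DBP and albumin are normal, which is exactly why Dr. Bikle expects no added information from it in such populations.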
In the study, the authors addressed their use of the calculation rather than direct measurement, noting that there is a “high correlation between both methods.”
But they added that, “as no equilibrium analysis method is available for free 25(OH)D, nor for free 1,25(OH)2D, no method can be considered superior.”
Dr. Dejaeger, of the department of public health and primary care, Katholieke Universiteit Leuven (Belgium), added that she agreed that high or low DBP could potentially shift some correlations, but noted that other research has shown calculated and direct measures to match relatively well.
“So we partly agree [with Dr. Bikle] not being surprised that we did not find an added value because we also found little variation in DBP, but we are not convinced that a different measurement method could make the difference here.”
Another caveat of the study is that, despite half of the measurements being taken in the summer, more than 90% of subjects in the study’s cohort had vitamin D insufficiency, defined in the study as total 25(OH)D levels below 30 mcg/L, and as many as 70% had deficiency, with levels below 20 mcg/L.
Therefore, “as the number of participants with high levels of total 25(OH)D in our study is small, a true threshold concentration for optimal vitamin D status cannot be defined on basis of our data,” the authors noted.
Under current recommendations, the Endocrine Society indicates that concentrations below 30 mcg/L are insufficient, while other groups, including the Institute of Medicine, suggest concentrations of 20 mcg/L or above are adequate.
Free hormone hypothesis
Under the free hormone hypothesis, which is observed with thyroid hormones and sex steroids, the very small fraction of free hormones that are not bound to protein carriers can enter cells and help facilitate biologic activity.
The hypothesis of a role of free 25(OH)D in mortality was supported by a recent study, in which free 25(OH)D levels – but not total 25(OH)D levels – were found to be independently associated with an increased risk of all-cause and cardiovascular mortality among patients with coronary artery disease.
However, two other studies are more consistent with the new findings, including one study showing no added value of free 25(OH)D as a marker for bone mineral density in older women, and another study showing no value as a marker of metabolic variables in healthy children.
“Currently, there are no hard data to support routine measurements of free 25(OH)D or 1,25(OH)2D over total 25(OH)D, the current standard of assessing vitamin D status, as stated in guidelines from different scientific bodies,” Dr. Dejaeger said in an interview.
The study received support from Versus Arthritis and the National Institute for Health Research Manchester Biomedical Research Centre. Dr. Dejaeger and Dr. Bikle had no disclosures to report.
FROM JOURNAL OF CLINICAL ENDOCRINOLOGY & METABOLISM
Vitamin D status may play a pivotal role in colon cancer prevention
This is according to an observational study published in the journal Gastroenterology. The study included 94,205 women (aged 25-42 years) who were followed between 1991 and 2015, during which time 111 incident cases of early-onset colorectal cancer were diagnosed. Among 29,186 women who had at least one lower endoscopy from 1991 to 2011, 1,439 newly diagnosed conventional adenomas and 1,878 serrated polyps were found.
Women who consumed the highest average levels of total vitamin D – 450 IU per day – showed a significantly reduced risk of early-onset colorectal cancer, compared with those consuming less than 300 IU per day. Consuming 400 IU each day was associated with a 54% reduced risk of early-onset colorectal cancer.
“If confirmed, our findings could potentially lead to recommendations for higher vitamin D intake as an inexpensive low-risk complement to colorectal cancer screening as a prevention strategy for adults younger than age 50,” wrote the study authors, led by Edward L. Giovannucci, MD, ScD, of the Harvard School of Public Health, Boston.
Associations between vitamin D levels and colorectal cancer have been documented in review articles over the years. The link is the subject of 10 recently completed or ongoing clinical trials. Few studies have focused on early-onset colorectal cancer and vitamin D intake. Unlike advanced colorectal cancer, the early-onset form of the disease is not as strongly associated with the traditional risk factor of a family history of colorectal cancer, and it is therefore believed to be more strongly linked to other factors, such as lifestyle and diet – including vitamin D supplementation.
The evidence is in, but it’s incomplete
In addition to the new study in Gastroenterology, other observational studies, as well as laboratory and animal studies, suggest that vitamin D plays a role in inhibiting carcinogenesis. Vitamin D, researchers theorize, has anti-inflammatory and immunomodulatory properties and can inhibit tumor angiogenesis, slowing the growth of tumors, but the evidence is mixed.
A meta-analysis of 137,567 patients published in 2013 in Preventive Medicine found an inverse association between 25-hydroxyvitamin D (25[OH]D) and total cancer mortality in women, but not among men. Three meta-analyses published in 2014 and 2019 found that vitamin D supplementation does not affect cancer incidence but does significantly reduce total cancer mortality rates by 12%-13%.
In 2019, researchers led by Marjorie McCullough, ScD, RD, senior scientific director of epidemiology research for the American Cancer Society, described a causal relationship between circulating vitamin D and colorectal cancer risk among 17 cohorts from a pooled analysis. “Our study suggests that optimal circulating 25(OH)D concentrations for colorectal cancer risk reduction are 75-100 nmol/L, [which is] higher than current Institute of Medicine recommendations for bone health,” she and colleagues wrote. Their findings were published in the Journal of the National Cancer Institute.
The Vitamin D and Omega-3 Trial (VITAL) published in 2019 in the New England Journal of Medicine, showed no significant effect of vitamin D3 supplementation of 2,000 IU/day in lowering the risk of invasive cancer or cardiovascular events.
Despite the mixed results, studies offer valuable insights into cancer risks, said Scott Kopetz, MD, PhD, codirector of the colorectal cancer moon shot research program at the University of Texas MD Anderson Cancer Center, Houston.
The Gastroenterology study is noteworthy because it focuses on early-onset colorectal cancer, he said.
“[The authors] demonstrate for the first time that there is an association of vitamin D intake with early-onset colorectal incidence, especially in the left side of the colon and rectum where the increase in early onset colorectal cancer manifests,” Dr. Kopetz said. “The analysis suggests that it may require long-term vitamin D intake to derive the benefit, which may explain why some shorter-term randomized studies failed to demonstrate [a benefit].”
In animal models, vitamin D3 is “estimated to lower the incidence of colorectal cancer by 50%,” according to Lidija Klampfer, PhD, formerly a molecular biologist and senior research scientist with the Southern Research Institute, Birmingham, Ala.
Dr. Klampfer, a founding partner of ProteXase Therapeutics, is the author of an article on vitamin D and colon cancer published in 2014 in the World Journal of Gastrointestinal Oncology.
“The levels of vitamin D3 appear to be an essential determinant for the development and progression of colon cancer and supplementation with vitamin D3 is effective in suppressing intestinal tumorigenesis in animal models,” she wrote. “Studies have shown that 1,25 dihydroxyvitamin D3 can inhibit tumor-promoting inflammation leading to the development and progression of colon cancer.”
The hazards of a vitamin D deficiency
A severe vitamin D deficiency is associated with compromised bone and muscle health, calcium absorption, immunity, and heart function, and it can affect mood. Other studies have linked vitamin D deficiency to colorectal cancer, blood cancers, and bowel cancer.
Serum 25(OH)D is the primary circulating form of vitamin D and is considered the best marker for assessing vitamin D status, says Karin Amrein, MD, MSc, an endocrinologist with the Medical University of Graz (Austria). She was the lead author of a review on vitamin D deficiency published in January 2020 in the European Journal of Clinical Nutrition.
The Global Consensus Recommendations define vitamin D insufficiency as a serum 25(OH)D concentration of 12-20 ng/mL (30-50 nmol/L) and deficiency as less than 12 ng/mL (30 nmol/L). A deficiency in adults is usually treated with 50,000 IU of vitamin D2 or D3 once weekly for 8 weeks, followed by maintenance dosages of cholecalciferol (vitamin D3) at 800-1,000 IU daily from dietary and supplemental sources.
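Because these articles mix mcg/L (equivalent to ng/mL) and nmol/L, a quick conversion is useful: 25(OH)D has a molecular weight of about 400.6 g/mol, so multiplying ng/mL by roughly 2.496 gives nmol/L. A minimal sketch, with the function name invented for illustration:

```python
# Unit conversion for 25(OH)D levels. 1 mcg/L = 1 ng/mL, and with a
# molecular weight of ~400.6 g/mol the conversion factor to nmol/L
# is approximately 2.496.
NG_ML_TO_NMOL_L = 2.496

def to_nmol_per_l(ng_per_ml: float) -> float:
    """Convert a 25(OH)D level from ng/mL (= mcg/L) to nmol/L."""
    return ng_per_ml * NG_ML_TO_NMOL_L

# The cutoffs quoted above line up across unit systems:
# 12 ng/mL -> ~30 nmol/L (deficiency), 20 ng/mL -> ~50 nmol/L
# (insufficiency), 30 ng/mL -> ~75 nmol/L (sufficiency).
```

The same factor connects the thresholds in both articles, such as the 75-100 nmol/L range cited for colorectal cancer risk reduction, which corresponds to roughly 30-40 ng/mL.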
Screening is recommended for individuals who exhibit symptoms and conditions associated with a vitamin D deficiency, but there is little agreement on recommended serum levels because every individual is different, according to the U.S. Preventive Services Task Force, which updated its vitamin D recommendations in April for the first time in 7 years.
This is according to an observational study published in the journal Gastroenterology. The study included 94,205 women (aged 25-42 years) who were followed between 1991 and 2015 during which 111 incident cases of early-onset colorectal cancer were diagnosed. Among 29,186 women who had at least one lower endoscopy from 1991 to 2011, 1,439 newly diagnosed conventional adenomas and 1,878 serrated polyps were found.
Women who consumed the highest average levels of total vitamin D of 450 IU per day, compared with those consuming less than 300 IU per day, showed a significantly reduced risk of early-onset colorectal cancer. Consuming 400 IU each day was associated with a 54% reduced risk of early-onset colorectal cancer.
“If confirmed, our findings could potentially lead to recommendations for higher vitamin D intake as an inexpensive low-risk complement to colorectal cancer screening as a prevention strategy for adults younger than age 50,” wrote the study authors, led by Edward L. Giovannucci, MD, ScD, of the Harvard School of Public Health, Boston.
Associations between vitamin D levels and colorectal cancer have been documented in review articles over the years. The link is the subject of 10 recently completed or ongoing clinical trials. Few studies have focused on early colorectal cancer and vitamin D intake. Unlike advanced colorectal cancer, the early-onset form of the disease is not as strongly associated with the traditional risk factors of a family history of colorectal cancer and it is therefore believed to be more strongly linked to other factors, such as lifestyle and diet – including vitamin D supplementation.
The evidence is in, but it’s incomplete
In addition to the new study in Gastroenterology, other observational studies, as well as laboratory and animal studies, suggest that vitamin D plays a role in inhibiting carcinogenesis. Vitamin D, researchers theorize, contains anti-inflammatory, immunomodulatory, and tumor angiogenesis properties that can slow the growth of tumors, but the evidence is mixed.
A meta-analysis of 137,567 patients published in 2013 in Preventive Medicine found an inverse association between 25-hydroxyvitamin D (25[OH]D) and total cancer mortality in women, but not among men. Three meta-analyses published in 2014 and 2019 found that vitamin D supplementation does not affect cancer incidence but does significantly reduce total cancer mortality rates by 12%-13%.
In 2019, researchers led by Marjorie McCullough, ScD, RD, senior scientific director of epidemiology research for the American Cancer Society, described a causal relationship between circulating vitamin D and colorectal cancer risk among 17 cohorts from a pooled analysis. “Our study suggests that optimal circulating 25(OH)D concentrations for colorectal cancer risk reduction are 75-100 nmol/L, [which is] higher than current Institute of Medicine recommendations for bone health,” she and colleagues wrote. Their findings were published in the Journal of the National Cancer Institute.
The Vitamin D and Omega-3 Trial (VITAL), published in 2019 in the New England Journal of Medicine, showed no significant effect of vitamin D3 supplementation at 2,000 IU/day in lowering the risk of invasive cancer or cardiovascular events.
Despite the mixed results, studies offer valuable insights into cancer risks, said Scott Kopetz, MD, PhD, codirector of the colorectal cancer moon shot research program at the University of Texas MD Anderson Cancer Center, Houston.
The Gastroenterology study is noteworthy because it focuses on early-onset colorectal cancer, he said.
“[The authors] demonstrate for the first time that there is an association of vitamin D intake with early-onset colorectal incidence, especially in the left side of the colon and rectum where the increase in early onset colorectal cancer manifests,” Dr. Kopetz said. “The analysis suggests that it may require long-term vitamin D intake to derive the benefit, which may explain why some shorter-term randomized studies failed to demonstrate.”
In animal models, vitamin D3 is “estimated to lower the incidence of colorectal cancer by 50%,” according to Lidija Klampfer, PhD, formerly a molecular biologist and senior research scientist with the Southern Research Institute, Birmingham, Ala.
Dr. Klampfer, a founding partner of ProteXase Therapeutics, is the author of an article on vitamin D and colon cancer published in 2014 in the World Journal of Gastrointestinal Oncology.
“The levels of vitamin D3 appear to be an essential determinant for the development and progression of colon cancer and supplementation with vitamin D3 is effective in suppressing intestinal tumorigenesis in animal models,” she wrote. “Studies have shown that 1,25 dihydroxyvitamin D3 can inhibit tumor-promoting inflammation leading to the development and progression of colon cancer.”
The hazards of a vitamin D deficiency
A severe vitamin D deficiency is associated with compromised bone and muscle health, impaired calcium absorption, weakened immunity, and reduced heart function, and it can affect mood. Other studies have linked vitamin D deficiency to colorectal (bowel) cancer and blood cancers.
Serum 25(OH)D is the primary circulating form of vitamin D and is considered the best marker for assessing vitamin D status, says Karin Amrein, MD, MSc, an endocrinologist with the Medical University of Graz (Austria). She was the lead author of a review on vitamin D deficiency published in January 2020 in the European Journal of Clinical Nutrition.
The Global Consensus Recommendations define vitamin D insufficiency as 12-20 ng/mL (30-50 nmol/L) and a deficiency as a serum 25OHD concentration less than 12 ng/mL (30 nmol/L). A deficiency in adults is usually treated with 50,000 IU of vitamin D2 or D3 once weekly for 8 weeks followed by maintenance dosages of cholecalciferol (vitamin D3) at 800-1,000 IU daily from dietary and supplemental sources.
Screening is recommended for individuals who exhibit symptoms and conditions associated with a vitamin D deficiency, but there is little agreement on recommended serum levels because every individual is different, according to the U.S. Preventive Services Task Force, which updated its vitamin D recommendations in April for the first time in 7 years.
FROM GASTROENTEROLOGY
Bone risk: Is time since menopause a better predictor than age?
Although early menopause is linked to an increased risk of bone loss and fracture, new research indicates that, even among the majority of women who undergo menopause after age 45, the time since the final menstrual period can be a stronger predictor of bone health and fracture risk than chronological age.
In a large longitudinal cohort, the number of years since a woman’s final menstrual period showed a stronger association with femoral neck bone mineral density (BMD) than chronological age, while an earlier age at menopause – even among those over 45 years – was linked to an increased risk of fracture.
“Most of our clinical tools to predict osteoporosis-related outcomes use chronological age,” first author Albert Shieh, MD, told this news organization.
“Our findings suggest that more research should be done to examine whether ovarian age (time since final menstrual period) should be used in these tools as well.”
The significance of time since the final menstrual period, compared with chronological age, has gained interest in risk assessment because of the known acceleration in the decline of BMD that begins 1 year before the final menstrual period and continues at a rapid pace for 3 years afterward before slowing.
To further investigate the association with BMD, Dr. Shieh, an endocrinologist specializing in osteoporosis at the University of California, Los Angeles, and his colleagues turned to data from the Study of Women’s Health Across the Nation (SWAN), a longitudinal cohort study of ambulatory women with pre- or early perimenopausal baseline data and 15 annual follow-up assessments.
Outcomes regarding postmenopausal lumbar spine (LS) or femoral neck (FN) BMD were evaluated in 1,038 women, while the time to fracture in relation to the final menstrual period was separately evaluated in 1,554 women.
In both cohorts, the women had a known final menstrual period at age 45 or older, and on average, their final menstrual period occurred at age 52.
After a multivariate adjustment for age, body mass index, and various other factors, they found that each additional year after a woman’s final menstrual period was associated with a significant 0.006 g/cm2 reduction in postmenopausal lumbar spine BMD and a 0.004 g/cm2 reduction in femoral neck BMD (both P < .0001).
Conversely, chronological age was not associated with a change in femoral neck BMD when evaluated independently of years since the final menstrual period, the researchers reported in the Journal of Clinical Endocrinology and Metabolism.
Regarding lumbar spine BMD, chronological age was unexpectedly associated not just with change, but in fact with increases in lumbar spine BMD (P < .0001 per year). However, the authors speculate the change “is likely a reflection of age-associated degenerative changes causing false elevations in BMD measured by dual-energy x-ray absorptiometry.”
Fracture risk with earlier menopause
In terms of the fracture risk analysis, despite the women all being aged 45 or older, earlier age at menopause was still tied to an increased risk of incident fracture, with a 5% increase in risk for each earlier year in age at the time of the final menstrual period (P = .02).
Compared with women who had their final menstrual period at age 55, for instance, those who finished menstruating at age 47 had a 6.3% greater 20-year cumulative fracture risk, the authors note.
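As a back-of-the-envelope check on those two figures, the reported 5% per-year increase compounds multiplicatively under a proportional-hazards assumption, so a final menstrual period 8 years earlier (age 47 vs. the age-55 comparison) implies a relative fracture hazard of about 1.48. This is an illustrative sketch, not the authors' actual model:

```python
# Illustrative arithmetic only: assumes the reported 5% increase in fracture
# risk per earlier year of menopause compounds multiplicatively, as it would
# in a proportional-hazards model (an assumption, not the study's own code).
per_year_increase = 0.05
years_earlier = 55 - 47  # final menstrual period at age 47 vs. age 55

relative_hazard = (1 + per_year_increase) ** years_earlier
print(f"Relative fracture hazard: {relative_hazard:.2f}")  # ~1.48, i.e. ~48% higher
```

Note that the 6.3% figure quoted above is an absolute difference in 20-year cumulative risk, which also depends on the baseline fracture rate, so it cannot be recovered from this ratio alone.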
Previous findings from the Malmo Perimenopausal Study showed that menopause before age 47 was associated with an 83% greater risk of densitometric osteoporosis and a 59% greater risk of fracture by age 77. The authors note, however, that the new study is unique in including only women whose final menstrual period occurred after age 45, thereby reducing the potential confounding of data from women under 45.
The new results “add to a growing body of literature suggesting that the endocrine changes that occur during the menopause transition trigger a pathophysiologic cascade that leads to organ dysfunction,” the authors note.
In terms of implications in risk assessment, “future studies should examine whether years since the final menstrual period predicts major osteoporotic fractures and hip fractures, specifically, and, if so, whether replacing chronological age with years since the final menstrual period improves the performance of clinical prediction tools, such as FRAX [Fracture Risk Assessment Tool],” they add.
Addition to guidelines?
Commenting on the findings, Peter Ebeling, MD, the current president of the American Society of Bone and Mineral Research, noted that the study importantly “confirms what we had previously anticipated, that in women with menopause who are 45 years of age or older a lower age of final menstrual period is associated with lower spine and hip BMD and more fractures.”
“We had already known this for women with premature ovarian insufficiency or an early menopause, and this extends the observation to the vast majority of women – more than 90% – with a normal menopause age,” said Dr. Ebeling, professor of medicine at Monash Health, Monash University, in Melbourne.
Despite the known importance of the time since final menstrual period, guidelines still focus on age in terms of chronology, rather than biology, emphasizing the risk among women over 50, in general, rather than the time since the last menstrual period, he noted.
“There is an important difference [between those two], as shown by this study,” he said. “Guidelines could be easily adapted to reflect this.”
Specifically, the association between lower age of final menstrual period and lower spine and hip BMD and more fractures requires “more formal assessment to determine whether adding age of final menstrual period to existing fracture risk calculator tools, like FRAX, can improve absolute fracture risk prediction,” Dr. Ebeling noted.
The authors and Dr. Ebeling had no disclosures to report.
FROM JOURNAL OF CLINICAL ENDOCRINOLOGY AND METABOLISM
Use live donors for liver transplants for HCC patients, say experts
A new study shows that, for patients with hepatocellular carcinoma (HCC), outcomes with liver transplants from live donors are better than outcomes with transplants from deceased donors, leading to calls for increasing the availability of live donation.
“Transplant programs worldwide should be encouraged to expand their live donor programs to manage patients with HCC,” suggest authors of the new study, published in September in JAMA Surgery.
The findings are important in light of the fact that among patients with HCC, liver transplants are restricted to those patients who have the highest chances of survival, owing to long donor organ waiting lists, say the authors. Use of transplants from living donors could increase the availability of organs for patients on the deceased donor waiting list.
“One could even argue that a living donor gives two organs back to the organ pool,” the authors comment.
“Efforts to expand the donor pool through living donor liver transplant for patients with HCC will ultimately increase the number of available deceased donor liver transplants to help all patients in need of liver transplant,” David A. Gerber, MD, of the University of North Carolina at Chapel Hill, and colleagues write in an accompanying commentary.
“It is very important that donors aren’t recruited or solicited, but with the growth of transplant programs, more potential donors will become aware of this opportunity and will step forward seeking to help someone else,” Dr. Gerber commented.
Live liver donor = lower death risk
The new study was conducted by first author Quirino Lai, MD, PhD, of the Department of General Surgery and Organ Transplantation, Sapienza University, Rome, and colleagues. They explain that the need to better understand the potential benefits of living donor organs is pressing: liver cancer rates continue to rise, and the demand for organs outpaces the supply. Although various smaller studies have shown survival benefits of live donor liver transplant for people with HCC, debate continues, as previous evidence has suggested higher cancer recurrence rates and unfavorable outcomes.
The multicenter study is thought to be the largest to date on this issue. The investigators evaluated data from patients who were on liver donation waiting lists for a first transplant between January 2000 and December 2017. The study included two cohorts of patients on waiting lists: an international cohort, consisting of 3,052 patients at 12 collaborative transplant centers in Europe, Asia, and the United States; and a Canadian cohort, consisting of 906 patients.
The majority of patients were men (80.2%). The median age at the time of first referral was 58 years.
About a third of patients (33.1%) in the international cohort and slightly fewer than a third (27%) in the Canadian cohort received live donor liver transplants; the remainder received liver transplants from deceased donors.
The median follow-up period was 3.3 years. Receiving a live donor liver transplant was independently associated with a 49% reduction in the overall risk for death (hazard ratio, 0.51) in the international cohort and a 43% reduction in the Canadian cohort (HR, 0.57; both P < .001).
After adjustment for potential confounders, living donor liver transplantation remained independently associated with a reduced risk of overall death: a reduction of 33% in the international cohort (P = .001) and of 48% in the Canadian cohort (P < .001).
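The percent reductions quoted above follow directly from the hazard ratios: a hazard ratio of 0.51 corresponds to a 1 − 0.51 = 49% lower instantaneous risk of death in the live donor group. A minimal sketch of the conversion:

```python
# Convert a hazard ratio (live donor vs. deceased donor) into the
# percent risk reduction quoted in the text: reduction = 1 - HR.
def percent_reduction(hazard_ratio):
    return round((1 - hazard_ratio) * 100)

print(percent_reduction(0.51))  # international cohort: 49
print(percent_reduction(0.57))  # Canadian cohort: 43
```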
“Divergent experiences all converged to a similar 40% to 50% reduction in intention-to-treat death risk,” the authors write.
Importantly, there were no increases in post-transplant cancer recurrence rates in the live donor groups in either cohort. Rates ranged from 13% to 16% over 5 years and from 17% to 22% after 10 years in both groups.
The median amount of time on the waiting list was significantly shorter for patients in the live donor group than for those in the deceased donor group (1 month vs. 6 months in the international cohort [P < .001]; 5 months vs. 6 months in the Canadian cohort [P = .006]).
Notably, in the international cohort there were 295 dropouts in the deceased donor group, compared with no dropouts among the live donor patients (P < .001). In the Canadian cohort, the corresponding dropout rates were 32.2% and 13.9% (P < .001).
Diverse transplant centers, larger cohorts set study apart
Although these latest results are consistent with those of recent studies conducted in France, Hong Kong, and elsewhere, in the current study, the cohorts were larger, say the authors.
“Compared with previous studies, all of which were based on relatively small case series, the present study examined the data of almost 4,000 patients who were on a waiting list for a transplant; therefore, this study may be the largest cohort study on this topic,” they point out.
In addition to improved timing of a transplant, other factors, such as patient selection, help explain the better survival, editorialist Dr. Gerber commented.
“Survival improvement [with live donor liver transplants] is a combination of [surgeon] experience in this transplant procedure and an appropriate selection bias, meaning taking patients who aren’t too sick while waiting on the transplant but who would benefit from the operation,” he said.
Gaining that experience may be particularly challenging in the United States, owing to regulatory barriers to expanding the programs, but efforts to overcome that are moving ahead, Dr. Gerber added.
“This issue of where an individual gains the experience or expertise is being discussed as transplantation has grown worldwide,” he notes.
As programs expand, the availability of live liver donors should improve, he suggested.
In a related story, this news organization recently reported on the controversial issue of liver transplant as an option for the treatment of liver metastases resulting from colorectal cancer.
Study coauthor Gonzalo Sapisochin, MD, has received grants from Bayer and Roche outside the submitted work as well as personal fees from Integra, Novartis, and AstraZeneca. No other relevant financial relationships were reported.
A version of this article first appeared on Medscape.com.
Gaining that experience may be particularly challenging in the United States, owing to regulatory barriers to expanding the programs, but efforts to overcome that are moving ahead, Dr. Gerber added.
“This issue of where an individual gains the experience or expertise is being discussed as transplantation has grown worldwide,” he notes.
As programs expand, the availability of live liver donors should improve, he suggested.
In a related story, this news organization recently reported on the controversial issue of liver transplant as an option for the treatment of liver metastases resulting from colorectal cancer.
Study coauthor Gonzalo Sapisochin, MD, has received grants from Bayer and Roche outside the submitted work as well as personal fees from Integra, Novartis, and AstraZeneca. No other relevant financial relationships were reported.
A version of this article first appeared on Medscape.com.
A new study shows that outcomes with liver transplants from live donors are better than outcomes with transplants from deceased donors, leading to calls for increasing the availability of live donation.
“Transplant programs worldwide should be encouraged to expand their live donor programs to manage patients with HCC,” suggest authors of the new study, published in September in JAMA Surgery.
The findings are important in light of the fact that among patients with HCC, liver transplants are restricted to those patients who have the highest chances of survival, owing to long donor organ waiting lists, say the authors. Use of transplants from living donors could increase the availability of organs for patients on the deceased donor waiting list.
“One could even argue that a living donor gives two organs back to the organ pool,” the authors comment.
“Efforts to expand the donor pool through living donor liver transplant for patients with HCC will ultimately increase the number of available deceased donor liver transplants to help all patients in need of liver transplant,” David A. Gerber, MD, of the University of North Carolina at Chapel Hill, and colleagues write in an accompanying commentary.
“It is very important that donors aren’t recruited or solicited, but with the growth of transplant programs, more potential donors will become aware of this opportunity and will step forward seeking to help someone else,” Dr. Gerber commented.
Live liver donor = lower death risk
The new study was conducted by first author Quirino Lai, MD, PhD, of the Department of General Surgery and Organ Transplantation, Sapienza University, Rome, and colleagues. They explain that the need to better understand the potential benefits of living donor organs is pressing. Liver cancer rates continue to rise, and the demand for organs outpaces the supply. Although various smaller studies have shown survival benefits of live donor liver transplant for people with HCC, debate continues, as previous evidence has suggested higher cancer recurrence rates and unfavorable outcomes.
The multicenter study is thought to be the largest to date on this issue. The investigators evaluated data from patients who were on liver donation waiting lists for a first transplant between January 2000 and December 2017. The study included two cohorts of patients on waiting lists: an international cohort, consisting of 3,052 patients at 12 collaborative transplant centers in Europe, Asia, and the United States; and a Canadian cohort, consisting of 906 patients.
The majority of patients were men (80.2%). The median age at the time of first referral was 58 years.
About a third of patients (33.1%) in the international cohort and slightly fewer than a third (27%) in the Canadian cohort received live donor liver transplants; the remainder received liver transplants from deceased donors.
The median follow-up period was 3.3 years. Receiving a live donor liver transplant was independently associated with a 49% reduction in the overall risk for death (hazard ratio, 0.51) in the international cohort and a 43% reduction in the Canadian cohort (HR, 0.57; both P < .001).
After adjustment for potential confounders, living donor liver transplantation remained independently associated with a reduced risk for overall death: a reduction of 33% in the international cohort (P = .001) and a reduction of 48% in the Canadian cohort (P < .001).
“Divergent experiences all converged to a similar 40% to 50% reduction in intention-to-treat death risk,” the authors write.
Importantly, there were no increases in post-transplant cancer recurrence rates in the live donor groups in either cohort. Rates ranged from 13% to 16% over 5 years and from 17% to 22% after 10 years in both groups.
The median amount of time on the waiting list was significantly shorter for patients in the live donor group than for those in the deceased donor group (1 month vs. 6 months in the international cohort [P < .001]; 5 months vs. 6 months in the Canadian cohort [P = .006]).
Notably, in the international cohort, there were 295 dropouts in the deceased donor group, compared with none among the live donor patients (P < .001). In the Canadian cohort, the corresponding dropout rates were 32.2% and 13.9% (P < .001).
Diverse transplant centers, larger cohorts set study apart
Although these latest results are consistent with those of recent studies conducted in France, Hong Kong, and elsewhere, the cohorts in the current study were larger, the authors say.
“Compared with previous studies, all of which were based on relatively small case series, the present study examined the data of almost 4,000 patients who were on a waiting list for a transplant; therefore, this study may be the largest cohort study on this topic,” they point out.
In addition to improved timing of a transplant, other factors, such as patient selection, help explain the better survival, editorialist Dr. Gerber commented.
“Survival improvement [with live donor liver transplants] is a combination of [surgeon] experience in this transplant procedure and an appropriate selection bias, meaning taking patients who aren’t too sick while waiting on the transplant but who would benefit from the operation,” he said.
Gaining that experience may be particularly challenging in the United States, owing to regulatory barriers to expanding the programs, but efforts to overcome that are moving ahead, Dr. Gerber added.
“This issue of where an individual gains the experience or expertise is being discussed as transplantation has grown worldwide,” he noted.
As programs expand, the availability of live liver donors should improve, he suggested.
In a related story, this news organization recently reported on the controversial issue of liver transplant as an option for the treatment of liver metastases resulting from colorectal cancer.
Study coauthor Gonzalo Sapisochin, MD, has received grants from Bayer and Roche outside the submitted work as well as personal fees from Integra, Novartis, and AstraZeneca. No other relevant financial relationships were reported.
A version of this article first appeared on Medscape.com.