In MS, iron-ringed lesions may add to imaging toolkit
STOCKHOLM – Iron-ringed lesions and the central vein sign, both identifiable on conventional MRI, may add to the imaging toolkit for distinguishing multiple sclerosis (MS) from its mimics, according to research presented at the annual congress of the European Committee for Treatment and Research in Multiple Sclerosis.
Using a conventional 3 Tesla magnetic resonance imaging (MRI) scanner, Margareta Clarke, PhD, and colleagues were able to identify iron rings (also called iron rims) and the central vein sign, and saw that both lesion characteristics were more common in MS patients than in those without MS.
“Routine two-dimensional 3 Tesla MRI with susceptibility weighting can be used to successfully visualize central veins and iron rims,” said Dr. Clarke, speaking at an imaging-focused young investigators’ session at the meeting. “Also, the central vein sign findings from previous 3T studies are confirmed.”
Dr. Clarke, a research fellow at the Vall d’Hebron Research Institute in Barcelona, explained that iron is stored within oligodendrocytes and myelin in the brain. In up to 56% of MS lesions, a rim of iron is visible on susceptibility-weighted MRI, she said, adding that the iron rings around the lesions “are likely caused by iron-laden activated microglia and macrophages that accumulate on the edges of lesions.”
It had been known that when lesions are surrounded by iron rings, they are more likely to enlarge and become increasingly hypointense on T1-weighted MRI. In addition, patients with more disability are more likely to have iron-rimmed brain lesions, said Dr. Clarke, and iron rings are associated with chronic disease activity. “Iron rings are a proposed marker of continuing inflammation and tissue loss,” she added.
The cross-sectional, single-center study enrolled patients with clinically isolated syndrome (CIS), MS, and conditions that can mimic MS on MRI. Dr. Clarke and her coinvestigators looked at the frequency of lesions with the central vein sign, and with iron rings, in all patients.
An additional aim of the study was to compare how experienced and inexperienced raters fared in their identification of both central veins and iron rings in 25 scans randomly chosen from within the study population. Inter-rater reliability between experienced and inexperienced raters was assessed as good, with little difference between experience levels in detecting iron rings and central veins, said Dr. Clarke.
Criteria used for central vein determination were those established by the North American Imaging in MS initiative, said Dr. Clarke: The vein needs to be seen entering and/or exiting the lesion, and the vein must course through the lesion’s center. If lesions are confluent, each of the larger lesion’s “fingers” must be assessed individually.
Iron rings appear as a hypointense area rimming the lesion’s edge; for the study, an iron ring was considered present if it could be seen fully or partially encircling a lesion, and if the ring was visible on at least two slices.
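Taken together, these two definitions amount to a simple rule-based classification applied lesion by lesion. The following Python sketch shows one way such rules could be encoded for analysis; the data structure and field names are hypothetical illustrations, not part of the study:

```python
from dataclasses import dataclass

@dataclass
class Lesion:
    # Hypothetical per-lesion annotations produced by a human rater
    vein_enters_or_exits: bool   # vein seen entering and/or exiting the lesion
    vein_through_center: bool    # vein courses through the lesion's center
    rim_slice_count: int         # slices showing a full or partial hypointense rim

def has_central_vein_sign(lesion: Lesion) -> bool:
    """NAIMS-style rule as described above; both conditions must hold.
    Confluent lesions must first be split into 'fingers,' each assessed
    as a separate Lesion."""
    return lesion.vein_enters_or_exits and lesion.vein_through_center

def has_iron_ring(lesion: Lesion) -> bool:
    """Study rule: a ring counts if it fully or partially encircles the
    lesion and is visible on at least two slices."""
    return lesion.rim_slice_count >= 2
```

Per-patient fractions of positive lesions computed this way are what the greater-than-50% versus less-than-20% comparisons reported below refer to.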
The study enrolled 103 patients with relapsing-remitting MS, 49 with progressive MS, 112 with CIS, and 35 non-MS patients; about 60% of this latter group had either autoimmune or vascular disease.
The fewest white matter lesions – a median of 4 per patient – were seen in the CIS group, while the progressive MS and non-MS groups each had a median of 7 lesions, and the relapsing-remitting MS group had a median of 10 lesions.
In all, 2,617 lesions were analyzed, and 1,352 were assessed as having the central vein sign. Patients with MS or CIS had central vein sign in more than 50% of their lesions, while the non-MS patients had fewer than 20% central vein–positive lesions. In CIS and MS patients, central vein–positive lesions occurred more frequently in the periventricular and subcortical regions, compared with other brain regions (P less than .001).
Iron rings were detected in 392 lesions; none of the non-MS patients had iron ring–positive lesions. In terms of the brain regions where iron rings were most likely to be seen, said Dr. Clarke, “Over half of all iron ring-positive lesions were periventricular.” This finding was statistically significant as well (P less than .001). At least one lesion with an iron ring was seen in 59% of relapsing-remitting MS patients, 39% of progressive MS patients, and 48% of CIS patients.
In terms of patient characteristics, men were 40% more likely to have iron ring–positive lesions, and patients with relapsing-remitting MS were 50% more likely than were patients with CIS to have iron rings. Iron rings became 3% less likely for each additional year of age, as well (P less than .01 for all comparisons).
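If the 3% annual decrease is read as a per-year odds ratio of 0.97 (a plausible interpretation; the presentation did not specify the model), the effect compounds multiplicatively across an age gap:

$$0.97^{10} \approx 0.74,$$

so a patient 10 years older would have roughly 26% lower odds of harboring iron ring–positive lesions, other factors being equal.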
“Our results show that iron ring numbers peak in relapsing-remitting MS and decrease with longer disease duration,” Dr. Clarke and colleagues reported.
Dr. Clarke acknowledged several limitations of the study, including its single-center and retrospective nature, as well as the relatively low numbers of non-MS patients and patients with progressive MS. She and her colleagues are planning larger studies using 5-year follow-up data, she said.
Dr. Clarke is an ECTRIMS-MAGNIMS fellow and reported a speaker honorarium from Novartis.
SOURCE: Clarke M et al. ECTRIMS 2019. Abstract 108.
REPORTING FROM ECTRIMS 2019
Supercooling extends donor liver viability by 27 hours
Standard cooling to 4°C provides just 12 hours of organ preservation, but laboratory testing showed that supercooling to –4°C added 27 hours of viability, reported lead author Reinier J. de Vries, MD, of Harvard Medical School and Massachusetts General Hospital in Boston, and colleagues.
“The absence of technology to preserve organs for more than a few hours is one of the fundamental causes of the donor organ–shortage crisis,” the investigators wrote in Nature Biotechnology.
Supercooling organs to high-subzero temperatures has been shown to prolong organ life while avoiding ice-mediated injury, but techniques that are successful for rat livers have been difficult to translate to human livers because of their larger size, which increases the risk of ice formation, the investigators explained.
Three strategies were employed to overcome this problem: minimization of air-liquid interfaces, development of a new supercooling-preservation solution, and hypothermic machine perfusion to more evenly distribute preservation solution throughout the liver tissue. For recovery of organs after supercooling, the investigators used subnormothermic machine perfusion, which has been used effectively in rat transplants.
In order to measure the impact of this process on organ viability, the investigators first measured adenylate energy content, both before supercooling and after recovery.
“Adenylate energy content, and, particularly, the organ’s ability to recover it during (re)perfusion, is considered the most representative metric for liver viability,” they wrote.
The difference between pre- and postsupercooling energy charge was less than 20%; in comparison, failed liver transplants in large animals and clinical trials have typically involved an energy-charge loss of 40% or more.
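For reference, the adenylate energy charge invoked here is conventionally computed from the adenine nucleotide pool (Atkinson's formula, which the report itself does not spell out):

$$\mathrm{EC} = \frac{[\mathrm{ATP}] + \tfrac{1}{2}[\mathrm{ADP}]}{[\mathrm{ATP}] + [\mathrm{ADP}] + [\mathrm{AMP}]}$$

On this 0-to-1 scale, a drop from an illustrative 0.80 before supercooling to 0.66 after recovery would be a 17.5% loss, within the under-20% change observed here, whereas a fall to 0.48 would correspond to the 40% loss typical of failed transplants (these particular values are illustrative, not the study's measurements).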
To further test organ viability, the investigators measured pre- and postsupercooling levels of bile production, oxygen uptake, and vascular resistance. All of these parameters have been shown to predict transplant success in rats, and bile production has additional precedent from human studies.
On average, bile production, portal resistance, and arterial resistance were not significantly affected by supercooling. Although portal vein resistance was 20% higher after supercooling, this compared favorably with increases of 100%-150% that have been measured in nonviable livers. Similarly, oxygen uptake increased by a mean of 17%, but this was roughly a third of the 51% increase that has been observed in livers with impaired viability.
Additional measures of hepatocellular injury, including AST and ALT, were also supportive of viability after supercooling. Histopathology confirmed these findings by showing preserved tissue architecture.
“In summary, we find that the human livers tested displayed no substantial difference in viability before and after extended subzero supercooling preservation,” the investigators wrote.
To simulate transplantation, the investigators reperfused the organs with normothermic blood containing platelets, complement, and white blood cells, which are drivers of ischemia-reperfusion injury. During this process, energy charge remained stable, indicating preserved mitochondrial function. While energy charge held steady, lactate metabolism increased along with bile and urea production, suggesting increased liver function. Bile pH and HCO3– levels fell within the range for viability. Although bile glucose exceeded proposed criteria, the investigators pointed out that levels still fell within parameters for research-quality livers. Lactate levels also rose within the first hour of reperfusion, but the investigators suggested that this finding should be interpreted with appropriate context.
“It should be considered that the livers in this study were initially rejected for transplantation,” they wrote, “and the confidence intervals of the lactate concentration at the end of reperfusion largely overlap with time-matched values reported by others during [normothermic machine perfusion] of rejected human livers.”
Hepatocellular injury and histology also were evaluated during and after simulated transplantation, respectively, with favorable results. Although sites of preexisting hepatic injury were aggravated by the process, and rates of apoptosis increased, the investigators considered these changes clinically insignificant.
Looking to the future, the investigators suggested that further refinement of the process could allow even lower storage temperatures while better preserving liver viability.
“The use of human livers makes this study clinically relevant and promotes the translation of subzero organ preservation to the clinic,” the investigators concluded. “However, long-term survival experiments of transplanted supercooled livers in swine or an alternative large animal model will be needed before clinical translation.”
The study was funded by the National Institutes of Health and the Department of Defense. Dr. de Vries and four other coauthors have provisional patent applications related to the study, and one coauthor disclosed a financial relationship with Organ Solutions.
SOURCE: de Vries RJ et al. Nature Biotechnol. 2019 Sep 9. doi: 10.1038/s41587-019-0223-y.
FROM NATURE BIOTECHNOLOGY
Educating teens, young adults about dangers of vaping
Physicians have been alarmed about the vaping craze for quite some time. This alarm has grown louder in the wake of news that electronic cigarettes have been associated with a mysterious lung disease.
Public health officials have reported that there have been 530 cases of vaping-related respiratory disease,1 and as of press time at least seven deaths had been attributed to vaping*. On Sept. 6, 2019, the Food and Drug Administration, Centers for Disease Control and Prevention, and other health officials issued an investigation notice on vaping and e-cigarettes,2 cautioning teenagers, young adults, and pregnant women to avoid e-cigarettes completely and cautioning all users to never buy e-cigarettes off the street or from social sources.
A few days later, on Sept. 9, the FDA’s Center for Tobacco Products issued a warning letter to JUUL Labs, makers of a popular e-cigarette, for illegal marketing of modified-risk tobacco products.3 Then on Sept. 10, health officials in Kansas reported that a sixth person has died of a lung illness related to vaping.4
Researchers have found that 80% of those diagnosed with the vaping illness used products that contained THC, the psychoactive ingredient in marijuana, 61% had used nicotine products, and 7% used cannabidiol (CBD) products. Vitamin E acetate is another substance identified in press reports as tied to the severe lung disease.
Most of the patients affected are adolescents and young adults, with an average age of 19 years.5 This comes as vaping among high school students rose 78% between 2017 and 2018.6 According to the U.S. surgeon general, one in five teens vapes. Other data show that most teens who use e-cigarettes have never smoked a traditional cigarette.7 Teens and young adults frequently do not buy* e-cigarette “pods” from gas stations but instead borrow or purchase them from friends or peers. In addition, young people are known to alter the pods to insert other liquids, such as CBD and other marijuana products.
Teens and young adults are at higher risk for vaping complications because their respiratory and immune systems are still developing. Beyond the recent surge of respiratory illnesses, nicotine is known to suppress the immune system, which makes people who use it more susceptible to viral and bacterial infections – and makes it harder for them to recover.
In addition, nicotine hyperactivates the reward centers of the brain, which can trigger addictive behaviors. Because the brain is not fully developed until about age 26, nicotine use before then can “prime the pump” of a still-developing brain, increasing the likelihood of addiction to harder drugs. Nicotine has also been shown to disrupt sleep patterns, which are critical for mental and physical health. Lastly, research shows that smoking increases the risk of various psychiatric disorders, such as depression and anxiety. My teen and young adult patients have endlessly debated with me the idea that smoking – either nicotine or marijuana – eases their anxiety or helps them get to sleep. I tell them that, in the long run, the data show that smoking makes those problems worse.8-11
Nationally, we are seeing an explosion of multistate legislation pushing marijuana as a health food. E-cigarettes have followed as the “healthy” alternative to traditional tobacco.
Finally, our world is now filled with smartphones, sexting, and social media overuse. An entire peer group exists that knows life only with constant electronic stimulation. It is not without irony that our national nicotine obsessions have morphed from paper cigarettes to electronic versions. This raises questions: Are teens and young adults using e-cigarettes because of boredom? Are we witnessing a generational ADHD born of restlessness that stems from lives with fewer meaningful face-to-face human interactions?
In addition to educating our teens and young adults about the physical risks tied to vaping, we need to teach them to build meaning into their lives that exists outside of this digital age.
Dr. Jorandby is chief medical officer of Lakeview Health in Jacksonville, Fla. She trained in addiction psychiatry at Yale University, New Haven, Conn.
References
1. CDC. Outbreak of lung injury associated with e-cigarette use, or vaping. 2019 Sep 19.
2. CDC. Outbreak of lung illness associated with using e-cigarette products. Investigation notice. 2019 Sep 6.
3. FDA. Warning letter, JUUL Labs. 2019 Sep 9.
4. Sixth person dies of vaping-related illness. The Hill. 2019 Sep 10.
5. Layden JE et al. Pulmonary illness related to e-cigarette use in Illinois and Wisconsin – preliminary report. N Engl J Med. 2019 Sep 6. doi: 10.1056/NEJMoa1911614.
6. Cullen KA et al. CDC. MMWR. 2018 Nov 16;67(45):1276-7.
7. National Academies of Sciences, Engineering, and Medicine. Public health consequences of e-cigarettes. 2018.
8. Patton GC et al. Am J Public Health. 1996 Feb;86(2):225-30.
9. Leventhal AM et al. J Psychiatr Res. 2016 Feb;73:71-8.
10. Levine A et al. J Am Acad Child Adolesc Psychiatry. 2017 Mar;56(3):214-25.
11. Leadbeater BJ et al. Addiction. 2019 Feb;114(2):278-93.
* This column was updated 9/24/2019.
Prior antibiotic use lowers checkpoint inhibitor response and survival
Prior antibiotic use may be associated with a reduced treatment response to checkpoint inhibitors, and worse outcomes, in patients with cancer, according to investigators.
In a prospective cohort study, researchers followed 196 patients with cancer who were treated with immune checkpoint inhibitors in routine clinical practice.
A total of 22 patients had received a course of 7 days or less of broad-spectrum, beta-lactam–based antibiotics in the 30 days before starting immune checkpoint inhibitor therapy, and 68 patients were taking broad-spectrum, beta-lactam–based antibiotics concurrently with their checkpoint inhibitor therapy.
The analysis revealed that prior antibiotic therapy was associated with nearly a 100% greater likelihood of poor response to checkpoint inhibitor therapy (P less than .001) and significantly worse overall survival (2 vs. 26 months). Patients who had been on prior antibiotic therapy were also more likely to stop checkpoint inhibitor therapy because their disease had progressed, and were more likely to die of progressive disease while on checkpoint inhibitors.
However, concurrent antibiotic use did not appear to affect either treatment response to checkpoint inhibitors or overall survival.
The most common indication for both prior and concurrent antibiotic use was respiratory tract infections. Researchers examined whether cancer type might play a role in contributing to the association; for example, chronic airway disease in lung cancer might mean higher likelihood of antibiotic use but also lower treatment response and survival.
They found that the association between prior antibiotic therapy and overall survival was consistent across the 119 patients with non–small cell lung cancer, the 38 patients with melanoma, and the 39 patients with other tumor types.
The association was also independent of the class of antibiotic used, the patient’s performance status, and their corticosteroid use.
“Broad-spectrum ATB [antibiotic] use can cause prolonged disruption of the gut ecosystem and impair the effectiveness of the cytotoxic T-cell response against cancer, strengthening the biologic plausibility underlying the adverse effect of ATB therapy on immunotherapy outcomes,” wrote Dr. David J. Pinato, from Imperial College London, and coauthors in JAMA Oncology.
Addressing the question of whether comorbidities might be the mediating factor, the authors pointed out that the use of antibiotics during checkpoint inhibitor therapy – which was a potential indicator of patients’ status worsening during treatment – was not associated with reduced response to treatment or lower overall survival.
“Although provision of cATB [concurrent antibiotic] therapy appears to be safe in the context of immunotherapy, clinicians should carefully weigh the pros and cons of prescribing broad-spectrum ATBs prior to ICI [immune checkpoint inhibitor] treatment,” they wrote.
The study was supported by the Imperial College National Institute for Health Research Biomedical Research Centre, the Imperial College Tissue Bank, the Imperial Cancer Research U.K. Centre, the National Institute for Health Research, and the Wellcome Trust Strategic Fund. Two authors reported receiving grant funding and personal fees from the pharmaceutical sector unrelated to the study.
SOURCE: Pinato D et al. JAMA Oncol. 2019 Sep 12. doi: 10.1001/jamaoncol.2019.2785.
FROM JAMA ONCOLOGY
Key clinical point: Patients who take antibiotics prior to checkpoint inhibitor therapy have poorer treatment responses and worse overall survival.
Major finding: Prior antibiotic use is associated with nearly double the likelihood of a poor response to checkpoint inhibitor therapy.
Study details: A prospective cohort study involving 196 patients receiving checkpoint inhibitor therapy for cancer.
Disclosures: The study was supported by the Imperial College National Institute for Health Research Biomedical Research Centre, the Imperial College Tissue Bank, the Imperial Cancer Research U.K. Centre, the National Institute for Health Research, and the Wellcome Trust Strategic Fund. Two authors reported receiving grant funding and personal fees from the pharmaceutical sector unrelated to the study.
Source: Pinato D et al. JAMA Oncol. 2019 Sep 12. doi: 10.1001/jamaoncol.2019.2785.
Ponesimod reduces annualized relapse rate, compared with teriflunomide
STOCKHOLM – Ponesimod reduces the annualized relapse rate, compared with teriflunomide, according to research presented at ECTRIMS 2019. Ponesimod also reduces fatigue and the number of active lesions, compared with teriflunomide.
Ponesimod selectively modulates the sphingosine-1-phosphate receptor 1 (S1P1). The drug is administered orally and reduces circulating lymphocyte counts by inducing a rapid, dose-dependent, and reversible sequestration of lymphocytes in lymphoid organs. This effect decreases the number of immune cells available for inflammatory attacks in the CNS, said Ludwig Kappos, MD, head of the department of neurology at University Hospital Basel (Switzerland). The drug has no active metabolites, and its effects on the immune system are reversible.
Dr. Kappos and colleagues conducted the OPTIMUM phase 3 study to assess the efficacy and safety of oral ponesimod, compared with those of teriflunomide. Into this multicenter, randomized, double-blind superiority study, they enrolled patients aged 18-55 years who had an established diagnosis of MS, according to the 2010 McDonald criteria, with a relapsing course from onset. Eligible patients had an Expanded Disability Status Scale (EDSS) score of 0 to 5.5 inclusive and recent clinical or MRI disease activity. Dr. Kappos and colleagues randomized participants in equal groups to receive ponesimod (20 mg/day) or teriflunomide (14 mg/day) and the respective placebo for 108 weeks. To mitigate the effects on heart rate that are associated with S1P1 modulators, ponesimod was titrated gradually from 2 mg/day to the target dose over the first 14 days.
The trial’s primary endpoint was the annualized relapse rate over 108 weeks. Secondary endpoints were the effect on fatigue-related symptoms, as assessed with the Fatigue Symptom and Impact Questionnaire–Relapsing MS (FSIQ-RMS); the number of active lesions on MRI through week 108; and time to 12-week and 24-week confirmed disability accumulation up to the end of the study. The investigators also assessed the drugs’ safety and tolerability.
Dr. Kappos and colleagues randomized 1,133 patients at 162 sites in 28 countries. They stratified randomization according to whether participants had received disease-modifying treatment in the previous 2 years (39.4% had, and 60.6% had not) and according to baseline EDSS score (83.4% had a score of 3.5 or lower, and 16.6% had a score above 3.5). The population’s mean age was 36.7 years, and 65% of participants were female. Most patients were recruited in Europe, and 51% came from E.U. countries. Patients’ mean baseline EDSS score was 2.6, and mean disease duration was 7.6 years. The mean prestudy 12-month relapse rate was 1.3, and 483 patients (42.7%) had one or more gadolinium-enhancing T1 lesions on baseline MRI. The two treatment groups were well balanced. The rate of treatment discontinuation was 16.6% for ponesimod and 16.4% for teriflunomide.
At the end of the study, the annualized relapse rate was 0.202 in the ponesimod group and 0.290 in the teriflunomide group; ponesimod thus significantly reduced the annualized relapse rate by 30.5%, compared with teriflunomide. Fatigue remained stable in the ponesimod group but worsened in the teriflunomide group: The mean difference in FSIQ-RMS score between the arms at week 108 was 3.57, a statistically significant result. In addition, ponesimod significantly reduced the number of active lesions by 56%, compared with teriflunomide. The risks of 12-week and 24-week confirmed disability accumulation were lower with ponesimod than with teriflunomide, but the differences were not statistically significant.
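As a rough consistency check, the relative reduction implied by the raw rates can be reproduced by hand (the published 30.5% comes from the model-based rate ratio, so the figures differ slightly):

\[
1 - \frac{\mathrm{ARR}_{\text{ponesimod}}}{\mathrm{ARR}_{\text{teriflunomide}}} = 1 - \frac{0.202}{0.290} \approx 0.30,
\]

that is, an approximately 30% relative reduction, in line with the reported 30.5%.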
The rates of treatment-emergent adverse events were approximately 89% for the ponesimod arm and 88% for teriflunomide. The rates of serious adverse events were about 9% for ponesimod and about 8% for teriflunomide. Respiratory events and laboratory values prompted slightly more study discontinuations in the ponesimod group than in the teriflunomide group.
This research represents the first controlled study to show superior efficacy of oral ponesimod, compared with an approved oral compound, said Dr. Kappos. “The overall profile suggests that [ponesimod] may be a valuable addition to our armamentarium in treating patients with relapsing forms of MS,” he concluded.
The study was supported by Actelion Pharmaceuticals. University Hospital Basel, where Dr. Kappos works, received steering committee, advisory board, and consultancy fees from Actelion and other companies.
SOURCE: Kappos L et al. ECTRIMS 2019, Abstract 93.
REPORTING FROM ECTRIMS 2019
Key clinical point: Ponesimod reduces the number of confirmed MS relapses, compared with teriflunomide.
Major finding: Annualized relapse rate was 30.5% lower with ponesimod, compared with teriflunomide.
Study details: A randomized, double-blind, superiority study of 1,133 patients with relapsing-remitting MS.
Disclosures: Actelion Pharmaceuticals sponsored the study.
Source: Kappos L et al. ECTRIMS 2019, Abstract 93.
Continuous treatment reduces risk of confirmed disability progression in MS
STOCKHOLM – Continuous exposure to disease-modifying therapy (DMT) reduces the risk of confirmed disability progression (CDP) in multiple sclerosis, according to an investigation presented at the annual congress of the European Committee for Treatment and Research in Multiple Sclerosis.
Using several confirmation points for Expanded Disability Status Scale (EDSS) progression (e.g., 12 months and 24 months), researchers detected a clear gradient of treatment effect. Identification of the most reliable outcome definitions will require further investigations, they said.
“The ultimate goal of MS treatment is the prevention of long-term disability accumulation,” said Giuseppe Lucisano, a biostatistician at the Center for Outcomes Research and Clinical Epidemiology in Pescara, Italy. “Continuous DMT exposure can impact long-term disability accumulation in MS, but it has not been definitively demonstrated yet.”
Registries and clinical databases provide the opportunity to collect longitudinal data for treated and untreated patients as a means of investigating questions such as this one, the researchers said. The Danish, Italian, and Swedish national MS registries, MSBase, and the French MS registry (Observatoire Français de la Sclérose en Plaques; OFSEP) merged their data in the Big Multiple Sclerosis Data (BMSD) Network, which includes approximately 150,000 patients and more than 470,000 EDSS evaluations. The result is a large dataset suitable for long-term longitudinal studies.
Mr. Lucisano and colleagues sought to examine the long-term effect of DMTs on CDP and on irreversible disability milestones (i.e., EDSS scores of 4 and 6) in relapsing-remitting MS. The researchers used marginal structural proportional hazards models, a technique that allows modeling to be adjusted for confounders that vary with time in longitudinal observational studies. Such confounders include treatment switches, on-treatment relapses, and treatment gaps.
The investigators selected patients with 10 or more years of follow-up and at least one EDSS evaluation per year from the BMSD pooled cohort. Using marginal structural proportional hazards models, they evaluated cumulative hazards of 3-, 12-, and 24-month CDP (i.e., CDP3, CDP12, and CDP24) events in 6-month periods. They created stabilized inverse probability of treatment weights (IPTWs) for each 6-month period, modeling treatment status (i.e., treated vs. untreated). Treatment status was assigned for each patient according to the percentage of time that he or she spent receiving DMT in each 6-month period: A patient who received treatment for 70% or more of the period was considered treated; patients who did not meet this threshold were considered untreated. The weights were calculated on the basis of sex, age, occurrence of relapse, EDSS score, and registry source. Finally, the researchers used Cox regression models estimating the effect of DMTs on the risk of reaching CDP3, CDP12, and CDP24, adjusted by the IPTWs, to compare cohorts that remained treated or untreated throughout follow-up.
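For readers unfamiliar with the technique, here is a minimal sketch of how stabilized IPTWs might be built from such patient-period data. The column names, the single pooled logistic model, and the data layout are illustrative assumptions, not the investigators' actual pipeline.

```python
# Minimal sketch: stabilized inverse probability of treatment weights (IPTW)
# over discrete 6-month periods. All column names are hypothetical.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

def stabilized_iptw(panel: pd.DataFrame) -> pd.Series:
    """Cumulative stabilized weight for each patient-period row.

    Expects `panel` sorted by patient_id and period, with:
      treated : 1 if the patient was on DMT for >=70% of the period, else 0
      sex, age, relapse, edss, registry : the stated confounders
    """
    X = pd.get_dummies(panel[["sex", "age", "relapse", "edss", "registry"]],
                       drop_first=True)
    # Denominator: P(treated | confounders), here one pooled logistic model.
    p_denom = (LogisticRegression(max_iter=1000)
               .fit(X, panel["treated"])
               .predict_proba(X)[:, 1])
    # Numerator: marginal P(treated) within each period; this "stabilizes"
    # the weights so they do not blow up over many periods.
    p_num = panel.groupby("period")["treated"].transform("mean").to_numpy()
    t = panel["treated"].to_numpy()
    ratio = np.where(t == 1, p_num / p_denom, (1 - p_num) / (1 - p_denom))
    # A patient's weight at a given period is the running product of the
    # ratios for all periods up to and including it.
    return (pd.Series(ratio, index=panel.index)
              .groupby(panel["patient_id"]).cumprod())
```

The weighted Cox regression is then fit with these values as case weights, contrasting treated with untreated epochs as the investigators describe.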
The investigators identified a cohort of 15,602 patients with relapsing-remitting MS, and this group had 312,040 EDSS score evaluations. Approximately 28% of patients were male. Median age at disease onset was 28.3 years, and median disease duration was 18.7 years. Median follow-up duration was 13.8 years.
During follow-up, 43.3% of patients had CDP3, 27.7% had CDP12, and 14.4% had CDP24 events. In addition, 23.6% of patients reached an EDSS score of 4, and 11.2% reached an EDSS score of 6.
Cox models adjusted by IPTW showed increasingly strong evidence of an effect of cumulative treatment exposure, compared with cumulative untreated epochs, as the confirmation time used to define CDP lengthened. The investigators did not observe an effect of treatment on the probability of reaching CDP3 (hazard ratio [HR], 1.02), but treatment had a protective effect on the CDP12 (HR, 0.90) and CDP24 (HR, 0.65; i.e., a 35% lower hazard) endpoints. During treated epochs, the HR for reaching an EDSS score of 4 was 0.89, and the HR for reaching an EDSS score of 6 was 0.86. Sensitivity analyses largely confirmed the results of the main analysis.
Two of the researchers are employees of Biogen International, which supported the research. Several investigators received compensation or funding from various pharmaceutical companies.
REPORTING FROM ECTRIMS 2019
Which Interventions Can Treat Cognitive Fatigue?
Key clinical point: Only one intervention – transcranial direct current stimulation (tDCS) – has been found to counteract cognitive fatigability in a trial with objective outcome measures.
Major finding: Compared with sham stimulation, anodal tDCS increased P300 amplitude and reduced fatigue-related decrements in reaction time in a preliminary study.
Study details: A systematic review of intervention studies that objectively measured cognitive fatigability in adults with neurologic disorders.
Disclosures: The authors had no disclosures.
Citation: Lindsay-Brown A et al. CMSC 2019, Abstract NNN10.
Out-of-Pocket Costs for MS Drugs Rose Significantly
Key clinical point: Prices of self-administered disease-modifying therapies for multiple sclerosis increased significantly from 2006 to 2016.
Major finding: Patients’ out-of-pocket costs increased by a factor of 7.2 during this period.
Study details: A cohort study of Medicare claims data from 2006 to 2016.
Disclosures: The Myers Family Foundation and the National Heart, Lung, and Blood Institute funded this research. Several authors are employees of health insurance companies such as the UPMC Health Plan Insurance Services Division and Humana. One author received personal fees from Pfizer that were unrelated to this study.
Citation: San-Juan-Rodriguez A et al. JAMA Neurol. 2019 Aug 26. doi: 10.1001/jamaneurol.2019.2711; Hartung DM and Bourdette D. JAMA Neurol. 2019 Aug 26. doi: 10.1001/jamaneurol.2019.2445.
Neurologists need not discourage breastfeeding in women with MS
STOCKHOLM – Most neurologists are overly conservative when it comes to advising women with multiple sclerosis (MS) about breastfeeding, discouraging this broadly beneficial practice in favor of early resumption of treatment post pregnancy, Kerstin Hellwig, MD, said at the annual congress of the European Committee for Treatment and Research in Multiple Sclerosis.
“We should change our behavior, and I predict we will change it so that more women are breastfeeding while under MS medication within the next couple of years,” she said.
She was a coauthor of a groundbreaking 2012 meta-analysis that concluded that breastfeeding by MS patients is not harmful (J Neurol. 2012 Oct;259[10]:2246-8), a finding since confirmed in multiple additional studies.
“Women with MS who want to breastfeed should be supported in doing so,” Dr. Hellwig said.
In this regard, many neurologists are out of step with their colleagues in rheumatology and gastroenterology, who commonly endorse breastfeeding by their patients while on monoclonal antibodies for other autoimmune diseases, according to Dr. Hellwig.
It is important to recognize that most women of reproductive age with MS have milder forms of the disease, she said. They can safely breastfeed without being on any MS medications at all for the duration.
For women who want to breastfeed and who have more-active disease warranting early treatment resumption, the key is to select a breastfeeding-compatible medication. The main determinant of whether a drug enters the mother’s breast milk is the size of the drug molecule: Large molecules are unlikely to reach breast milk in anything approaching clinically meaningful amounts. The injectable first-line disease-modifying drugs are good options. For example, interferon-beta is a very large molecule that has been detected in breast milk at a relative infant dose of only 0.0006%. That is reassuring, Dr. Hellwig said, since a relative infant dose below 10% is generally considered safe for the baby. And while glatiramer acetate, another injectable, has not been tested, it is metabolized so rapidly that it is unlikely to be detectable in breast milk, according to Dr. Hellwig.
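For context, the relative infant dose (RID) cited here is the standard lactation-pharmacology ratio:

\[
\mathrm{RID} = \frac{\text{infant dose received via milk (mg/kg/day)}}{\text{maternal dose (mg/kg/day)}} \times 100\%.
\]

An RID below 10% is the conventional compatibility cutoff Dr. Hellwig referenced; the interferon-beta figure of 0.0006% lies more than four orders of magnitude below it.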
Monoclonal antibodies are also compatible with breastfeeding. Rituximab has been detected in breast milk at 1/240th of the maternal serum level, and natalizumab at less than 1/200th. These are large molecules with a low likelihood of infant absorption, since they are probably destroyed in the child’s gastrointestinal tract. Ocrelizumab has not been studied in breast milk, but it is an IgG1 monoclonal antibody, as is rituximab, and so should likewise pose “exceedingly low risk,” Dr. Hellwig said.
At last year’s ECTRIMS conference, she presented reassuring 1-year follow-up data on a cohort of infants breastfed by mothers with MS while on interferon-beta. “We do not see any growth disturbances, any severe infections, hospitalizations, excess antibiotic use, or postponed reaching of developmental milestones in babies being breastfed under the injectables,” she said.
Dr. Hellwig has served on scientific advisory boards for Bayer, Biogen, Sanofi Genzyme, Teva, Roche, Novartis, and Merck. She has received speaker honoraria and research support from Bayer, Biogen, Merck, Novartis, Sanofi Genzyme, and Teva, and has received support for congress participation from Bayer, Biogen, Genzyme, Teva, Roche, and Merck.
EXPERT ANALYSIS FROM ECTRIMS 2019
Drug doses for heart failure could possibly be halved for women
Men and women respond differently to common drugs used to treat heart failure with reduced ejection fraction (HFrEF), according to findings from a new European study, and women may be able to safely cut their doses in half and obtain the same benefit that larger doses provide.
“This study ... brings into question what the true optimal medical therapy is for women versus men,” the study authors, led by Bernadet T. Santema, MD, of the University Medical Center Groningen (the Netherlands), wrote in an article published in the Lancet.
Dr. Santema and colleagues noted that current guidelines for the use of ACE inhibitors or angiotensin-receptor blockers (ARBs) and beta-blockers for men and women with heart failure do not differentiate between the genders, despite findings showing that, “with the same dose, the maximum plasma concentrations of ACE inhibitors, ARBs, and beta-blockers were up to 2.5 times higher in women than in men.”
In addition, the researchers wrote, women are much more likely than men to suffer side effects from medications, and the effects tend to be more severe.
HFrEF accounts for an estimated 50% of the 5.7 million patients with heart failure in the United States (Nat Rev Dis Primers. 2017 Aug 24. doi: 10.1038/nrdp.2017.58; Card Fail Rev. 2017;3[1]:7-11).
For the new study, the researchers performed a post hoc analysis of a prospective study of patients with HFrEF in 11 European countries (1,308 men and 402 women) who were taking drugs in the three classes. Patients were receiving suboptimal medication doses at the start of the study, and physicians were encouraged to up-titrate. The median follow-up for the primary endpoint was 21 months.
“In men, the lowest hazards of death or hospitalization for heart failure occurred at 100% of the recommended dose of ACE inhibitors or ARBs and beta-blockers, but women showed about 30% lower risk at only 50% of the recommended doses, with no further decrease in risk at higher dose levels,” the researchers wrote. “These sex differences were still present after adjusting for clinical covariates, including age and body surface area.”
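One simplified way to probe a sex-specific dose-outcome relationship of this kind is a Cox model with a sex-by-dose interaction. The sketch below is hypothetical (invented file and column names) and is not the authors' analysis, which modeled the dose-response relationship more flexibly:

```python
# Hypothetical sketch: testing whether the effect of achieved dose (as a
# fraction of the guideline-recommended dose) on death or HF hospitalization
# differs by sex. Column names are invented for illustration.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("hf_cohort.csv")  # one row per patient
# dose_frac in [0, 1]; female = 1 for women; age and bsa as covariates
df["dose_x_female"] = df["dose_frac"] * df["female"]

cph = CoxPHFitter()
cph.fit(df[["months_to_event", "event", "dose_frac", "female",
            "dose_x_female", "age", "bsa"]],
        duration_col="months_to_event", event_col="event")
cph.print_summary()  # a significant dose_x_female coefficient would indicate
                     # that the dose-outcome relationship differs by sex
```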
The researchers analyzed an Asian registry (3,539 men, 961 women) as a comparison and found a consistent pattern.
“Our study provides evidence supporting the hypothesis that women with HFrEF might have the best outcomes with lower doses of ACE inhibitors or ARBs and beta-blockers than do men, and lower doses than recommended in international guidelines for heart failure,” they wrote. However, they added that it was not likely that sex-specific studies analyzing doses would be performed.
In an accompanying editorial, Heather P. Whitley, PharmD, and Warren D. Smith, PharmD, noted that clinical research has often failed to take gender differences into account. They wrote that the study – the first of its kind – was well executed and raises important questions, but the analysis did not take into account the prevalence of adverse effects or the serum concentrations of the various medications. Although those limitations weaken the findings, the study still offers evidence that gender-based drug-dosing guidelines deserve consideration, wrote Dr. Whitley, of Auburn (Ala.) University, and Dr. Smith, of Baptist Health System, Montgomery, Ala. (Lancet. 2019 Aug 22. doi: 10.1016/S0140-6736[19]31812-4).
The study was funded by the European Commission. Several study authors reported various disclosures. Dr. Whitley and Dr. Smith reported no conflicts of interest.
SOURCE: Santema BT et al. Lancet. 2019 Aug 22. doi: 10.1016/S0140-6736(19)31792-1.
FROM THE LANCET