CAR T-cell therapy neurotoxicity linked to NfL elevations
“This is the first study to show NfL levels are elevated even before CAR T treatment is given,” first author Omar H. Butt, MD, PhD, of the Siteman Cancer Center at Barnes-Jewish Hospital and Washington University in St. Louis, said in an interview.
“While unlikely to be the sole driver of [the neurotoxicity], neural injury reflected by NfL may aid in identifying a high-risk subset of patients undergoing cellular therapy,” the authors concluded in the study, published in JAMA Oncology.
CAR T-cell therapy has virtually revolutionized the treatment of some leukemias and lymphomas. However, as many as 40%-60% of patients develop a neurotoxic side effect called immune effector cell–associated neurotoxicity syndrome (ICANS), which, though usually low grade, can cause substantial morbidity and even mortality in more severe cases.
Hence, “the early identification of patients at risk for ICANS is critical for preemptive management,” the authors noted.
NfL, an established marker of neuroaxonal injury in neurodegenerative diseases including multiple sclerosis and Alzheimer’s disease, has been shown in previous studies to be elevated following the development of ICANS and up to 5 days prior to its peak symptoms.
To further evaluate NfL elevations in relation to ICANS, Dr. Butt and colleagues identified 30 patients undergoing CD19 CAR T-cell therapy, including 77% for diffuse large B-cell lymphoma, at two U.S. centers: Washington University in St. Louis and Case Western Reserve University, Cleveland.
The patients had a median age of 64 and were 40% female.
Among them, four developed low-grade ICANS (grade 1-2) and seven developed ICANS of grade 3 or higher.
Baseline NfL levels prior to CAR T-cell treatment were significantly higher among those who went on to develop any-grade ICANS than among those who did not (mean, 87.6 pg/mL vs. 29.4 pg/mL; P < .001), with no significant difference between the low-grade (1-2) and higher-grade (3 or higher) ICANS groups.
A receiver operating characteristic (ROC) analysis showed that baseline NfL levels significantly predicted the development of ICANS, with high accuracy (area under the ROC curve, 0.96), sensitivity (0.91), and specificity (0.95).
Notably, baseline NfL levels were associated with ICANS severity but did not correlate with other factors, including demographics, oncologic history, nononcologic neurologic history, or history of exposure to neurotoxic therapies.
However, Dr. Butt added, “it is important to note that our study was insufficiently powered to examine those relationships in earnest. Therefore, [a correlation between NfL and those factors] remains possible.”
The elevated NfL levels observed prior to the development of ICANS remained high across the study’s seven time points, up to day 30 post infusion.
Interest in NfL levels on the rise
NfL assessment is currently clinically validated only in amyotrophic lateral sclerosis, where it is used to assess neuroaxonal health and integrity. However, testing is available, and interest in and evidence for NfL’s potential role in other settings are growing.
Meanwhile, Dr. Butt and associates are developing an assay to predict the development of ICANS, which will likely include NfL if its role is validated in further studies.
“Future studies will explore validating NfL for ICANS and additional indications,” he said.
ICANS symptoms can range from headaches and confusion to seizures or strokes in more severe cases.
The current gold standard for treatment includes early intervention with high-dose steroids and careful monitoring, but there is reluctance to use such therapies because of concerns about their blunting the anticancer effects of the CAR T cells.
Importantly, if validated, elevations in NfL could signal the need for more precautionary measures with CAR T-cell therapy, Dr. Butt noted.
“Our data suggests patients with high NfL levels at baseline would benefit most from perhaps closer monitoring with frequent checks and possible early intervention at the first sign of symptoms, a period of time when it may be hard to distinguish ICANS from other causes of confusion, such as delirium,” he explained.
Limitations: Validation, preventive measures needed
Commenting on the study, Sattva S. Neelapu, MD, a professor and deputy chair of the department of lymphoma and myeloma at the University of Texas MD Anderson Cancer Center, Houston, agreed that the findings have potentially important implications.
“I think this is a very intriguing and novel finding that needs to be investigated further prospectively in a larger cohort and across different CAR T products in patients with lymphoma, leukemia, and myeloma,” Dr. Neelapu said in an interview.
The NfL elevations observed even before CAR T-cell therapy among those who went on to develop ICANS are notable, he added.
“This is the surprising finding in the study,” Dr. Neelapu said. “It raises the question whether neurologic injury is caused by prior therapies that these patients received or whether it is an age-related phenomenon, as we do see higher incidence and severity of ICANS in older patients or some other mechanisms.”
A key caveat, however, is that even if a risk is identified, options to prevent ICANS are currently limited, Dr. Neelapu noted.
“I think it is too early to implement this into clinical practice,” he said. In addition to needing further validation, “assessing NfL levels would be useful when there is an effective prophylactic or therapeutic strategy – both of which also need to be investigated.”
Dr. Butt and colleagues are developing a clinical assay for ICANS and reported a provisional patent pending on the use of plasma NfL as a predictive biomarker for ICANS. The study received support from Washington University in St. Louis, the Paula and Rodger O. Riney Fund, the Daniel J. Brennan MD Fund, the Fred Simmons and Olga Mohan Fund, the National Cancer Institute, the National Multiple Sclerosis Society, and the National Institute of Neurological Disorders and Stroke. Dr. Neelapu reported conflicts of interest with numerous pharmaceutical companies.
FROM JAMA ONCOLOGY
Novel approach brings hospice-bound MM patient into remission
In a case that researchers hope might pave the way for similar responses, a hospice-bound relapsed/refractory multiple myeloma (RRMM) patient who relapsed after chimeric antigen receptor (CAR) T-cell therapy was brought back into remission with the help of next-generation genomic sequencing, targeted molecular analysis and a novel combination of MAP kinase (MAPK)–inhibiting drugs.
“We have shown that comprehensive molecular profiling of advanced myeloma patients may provide critical information to guide treatment beyond standard of care,” senior author Alessandro Lagana, PhD, of the Tisch Cancer Institute, Icahn School of Medicine at Mount Sinai, New York, said in an interview.
“This represents proof of concept that, while not curative, targeted molecules may serve as potential bridging therapies to clinical trial enrollment,” the authors further report in the case study, published recently in the Journal of Hematology & Oncology.
CAR T-cell therapies, including those targeting B-cell maturation antigen (BCMA), have transformed the treatment of multiple myeloma and some leukemias, resulting in high response rates. However, most patients ultimately relapse, and no clear treatment options beyond CAR T-cell therapy are established.
Such was the case for a 61-year-old patient described in the study, who had relapsed 6 months after undergoing anti-BCMA CAR T-cell therapy and progressed after a brief salvage response to autologous stem cell transplantation. The patient had developed extramedullary disease of the skin, manifesting as subcutaneous nodules.
“The subcutaneous skin lesions in lower extremities made him [ineligible] for another clinical trial and left him with no options,” Dr. Lagana said.
Using next-generation whole-exome sequencing, Dr. Lagana and colleagues had observed that a previously identified BRAF V600E–dominant subclone had persisted, despite the CAR T-cell treatment, in the patient’s bone marrow and cutaneous plasmacytoma.
The finding was not uncommon. More than half of RRMM patients (about 53%) show emerging clones with mutations within the MAPK signaling pathway, and in about 7% of patients, those include BRAF V600E, which can be targeted, the authors noted.
Further assessment of the patient’s CD138-positive MM cells, using Western blot analysis of signaling pathways along with DNA and RNA markers, did indeed show increased MAPK signaling as a consequence of the mutation. This suggested a potential benefit of triple MAPK inhibition compared with standard strategies.
Based on that information and on insights from their previous research, the researchers implemented the novel, orally administered triple-combination strategy, consisting of monomeric BRAF inhibition with dabrafenib (100 mg twice daily), dimeric inhibition with the multikinase inhibitor regorafenib (40 mg once daily), and the MEK inhibitor trametinib (1.5 mg daily for 21 of every 28 days).
Of note, previous efforts using only monomeric inhibition of BRAF have not shown much success, but early data have shown some potential with the addition of dimeric inhibition.
“Monomeric inhibition of BRAF has been attempted in patients with V600E, but the efficacy has been limited, likely due to feedback activation of the MAPK pathway via induction of BRAF dimer formation,” Dr. Lagana explained.
Meanwhile, “previous in vitro data from our colleagues at Mount Sinai has shown that inhibition of both monomeric and dimeric forms of BRAF in combination with MEK inhibition can overcome the negative feedback and lead to more efficacious and tolerable treatment,” he said.
With the treatment, the patient achieved a very good partial response for 110 days, with prompt reduction of the subcutaneous skin lesions and an 80% reduction in lambda free light chain (27.5 mg/L).
The triple-drug combination was well tolerated with minimal side effects, primarily involving grade 1 fatigue, and the patient was able to carry out activities of daily living and return to work.
“The triple inhibition allowed us to use less of each drug, which resulted in a well-tolerated regimen without any significant side effects,” Dr. Lagana said.
While the patient relapsed about 3 months later, there was, importantly, no recurrence of the subcutaneous nodules.
“We believe that the triple MAPK inhibition completely eradicated the disease clones driving the extramedullary disease,” Dr. Lagana said.
The therapy meanwhile enabled the patient to bridge to a new clinical trial, in which he achieved a complete remission that was ongoing as of Sept. 29.
“To our knowledge, this was the first reported successful case of this treatment in an RRMM patient,” Dr. Lagana explained.
Case suggests ‘hope’ for relapsing patients
Importantly, until such targeted medicine gains momentum, many patients in the same position may wind up going to hospice, coauthor Samir Parekh, MD, a professor of hematology-oncology at the Hess Center for Science and Medicine, Icahn School of Medicine at Mount Sinai, said in an interview.
“As precision medicine is in its infancy in myeloma, these patients are not routinely sequenced for drug options that may be identified by next-generation sequencing,” said Dr. Parekh.
But for clinicians, the message of this case should be that “there is hope for patients relapsing after CAR T,” he added.
“Precision medicine approaches may be applicable even for this relapsed patient population,” he said. “MAP kinase mutations are common, and drugs targeting them may be useful in myeloma.”
Noting that “the infrastructure to test and guide application of these therapies needs to be developed for myeloma,” Dr. Parekh predicted that, “in the future, more effective MAPK inhibitors and other mutation- or RNA-seq–guided therapies will be applicable and hopefully provide more durable remissions.”
Approach may help address unmet need
Until then, however, treatment for patients who relapse after CAR T-cell and BCMA-targeted therapies has emerged as a significant unmet need, and this case highlights an important potential strategy, said Hans Lee, MD, an associate professor in the department of lymphoma/myeloma, division of cancer medicine, University of Texas MD Anderson Cancer Center, Houston, who commented on the study.
“This case report provides impetus for oncologists to strongly consider performing next-generation sequencing on myeloma tumor samples to look for potential actionable mutations, such as those in the MAPK pathway – which are common in myeloma,” he said. “With limited treatment options in the post–CAR T and post-BCMA setting, identifying such actionable mutations may at least provide a bridge to other effective therapies available through clinical trials, as in this patient’s case.”
Dr. Lee noted that key caveats include the fact that most physicians currently don’t have access to the type of next-generation sequencing and drug sensitivity testing used in the study.
Nevertheless, considering the limited options in the post–CAR T and post-BCMA setting, “the successful use of triple MAPK pathway inhibition through monomeric and dimeric inhibition of BRAF and MEK inhibition warrants further study in multiple myeloma in a clinical trial,” he said.
Dr. Lagana and associates are doing just that.
“We are about to launch the clinical trial, where we will match advanced RRMM patients with potential targeted treatments using different DNA and RNA markers,” Dr. Lagana said.
Dr. Lagana and Dr. Parekh had no disclosures to report. Three study coauthors reported receiving research grants or consulting fees from numerous pharmaceutical companies.
FROM THE JOURNAL OF HEMATOLOGY & ONCOLOGY
Vitamin D supplementation shows no COVID-19 prevention
Two large studies out of the United Kingdom and Norway show vitamin D supplementation has no benefit – as low dose, high dose, or in the form of cod liver oil supplementation – in preventing COVID-19 or acute respiratory tract infections, regardless of whether individuals are deficient or not.
The studies, published in the BMJ, underscore that “vaccination is still the most effective way to protect people from COVID-19, and vitamin D and cod liver oil supplementation should not be offered to healthy people with normal vitamin D levels,” writes Peter Bergman, MD, of the Karolinska Institute, Stockholm, in an editorial published alongside the studies.
Suboptimal levels of vitamin D are known to be associated with an increased risk of acute respiratory infections, and some observational studies have linked low 25-hydroxyvitamin D (25[OH]D) with more severe COVID-19; however, data on a possible protective effect of vitamin D supplementation in preventing infection have been inconsistent.
U.K. study compares doses
To further investigate the relationship with infections, including COVID-19, in a large cohort, the authors of the first of the two BMJ studies, a phase 3 open-label trial, enrolled 6,200 people in the United Kingdom aged 16 and older between December 2020 and June 2021 who were not taking vitamin D supplements at baseline.
Half of participants were offered a finger-prick blood test, and of the 2,674 who accepted, 86.3% were found to have low concentrations of 25(OH)D (< 75 nmol/L). These participants were provided with vitamin D supplementation at a lower (800 IU/day; n = 1,328) or higher (3,200 IU/day; n = 1,346) dose for 6 months. The other half of the group received no tests or supplements.
The results showed minimal differences between groups in terms of rates of developing at least one acute respiratory infection, which occurred in 5% of those in the lower-dose group, 5.7% in the higher-dose group, and 4.6% of participants not offered supplementation.
Similarly, there were no significant differences in the development of real-time PCR-confirmed COVID-19, with rates of 3.6% in the lower-dose group, 3.0% in the higher-dose group, and 2.6% in the group not offered supplementation.
The study is “the first phase 3 randomized controlled trial to evaluate the effectiveness of a test-and-treat approach for correction of suboptimal vitamin D status to prevent acute respiratory tract infections,” report the authors, led by Adrian R. Martineau, MD, PhD, of Barts and The London School of Medicine and Dentistry, Queen Mary University of London.
While uptake of testing and supplementation in the study was favorable, “no statistically significant effect of either dose was seen on the primary outcome of swab test– or doctor-confirmed acute respiratory tract infection, or on the major secondary outcome of swab test–confirmed COVID-19,” they conclude.
Traditional use of cod liver oil of benefit?
In the second study, researchers in Norway, led by Arne Soraas, MD, PhD, of the department of microbiology, Oslo University Hospital, evaluated whether that country’s long-held tradition of consuming cod liver oil during the winter to prevent vitamin D deficiency could affect the development of COVID-19 or outcomes.
For the Cod Liver Oil for COVID-19 Prevention Study (CLOC), a large cohort of 34,601 adults with a mean age of 44.9 years who were not taking daily vitamin D supplements were randomized to receive 5 mL/day of cod liver oil, representing a surrogate dose of 400 IU/day of vitamin D (n = 17,278), or placebo (n = 17,323) for up to 6 months.
In contrast with the first study, the vast majority of patients in the CLOC study (86%) had adequate vitamin D levels, defined as greater than 50 nmol/L, at baseline.
Again, however, the results showed no association between vitamin D supplementation via cod liver oil and PCR-confirmed COVID-19 or acute respiratory infections, with approximately 1.3% in each group testing positive for COVID-19 over a median of 164 days.
Supplementation with cod liver oil was also not associated with a reduced risk of any of the coprimary endpoints, including other acute respiratory infections.
“Daily supplementation with cod liver oil, a low-dose vitamin D, eicosapentaenoic acid, and docosahexaenoic acid supplement, for 6 months during the SARS-CoV-2 pandemic among Norwegian adults did not reduce the incidence of SARS-CoV-2 infection, serious COVID-19, or other acute respiratory infections,” the authors report.
Key study limitations
In his editorial, Dr. Bergman underscores the limitations of the two studies – also acknowledged by the authors – including the key confounding role of vaccines that emerged during the studies.
“The null findings of the studies should be interpreted in the context of a highly effective vaccine rolled out during both studies,” Dr. Bergman writes.
In the U.K. study, for instance, only 1.2% of participants were vaccinated at baseline, but 89.1% had received at least one dose by study end, potentially masking any effect of vitamin D, he says.
Additionally, for the Norway study, Dr. Bergman notes that cod liver oil also contains a substantial amount of vitamin A, which can be a potent immunomodulator.
“Excessive intake of vitamin A can cause adverse effects and may also interfere with vitamin D-mediated effects on the immune system,” he writes.
With two recent large meta-analyses showing that the benefits of vitamin D supplementation occur specifically among people who are vitamin D deficient, “a pragmatic approach for the clinician could be to focus on risk groups” for supplementation, Dr. Bergman writes.
“[These include] those who could be tested before supplementation, including people with dark skin, or skin that is rarely exposed to the sun, pregnant women, and elderly people with chronic diseases.”
The U.K. trial was supported by Barts Charity, Pharma Nord, the Fischer Family Foundation, DSM Nutritional Products, the Exilarch’s Foundation, the Karl R. Pfleger Foundation, the AIM Foundation, Synergy Biologics, Cytoplan, the Clinical Research Network of the U.K. National Institute for Health and Care Research, the HDR UK BREATHE Hub, the U.K. Research and Innovation Industrial Strategy Challenge Fund, Thornton & Ross, Warburtons, Hyphens Pharma, and philanthropist Matthew Isaacs.
The CLOC trial was funded by Orkla Health, the manufacturer of the cod liver oil used in the trial. Dr. Bergman has reported no relevant financial relationships.
A version of this article first appeared on Medscape.com.
Two large studies out of the United Kingdom and Norway show vitamin D supplementation has no benefit – as low dose, high dose, or in the form of cod liver oil supplementation – in preventing COVID-19 or acute respiratory tract infections, regardless of whether individuals are deficient or not.
The studies, published in the BMJ, underscore that “vaccination is still the most effective way to protect people from COVID-19, and vitamin D and cod liver oil supplementation should not be offered to healthy people with normal vitamin D levels,” writes Peter Bergman, MD, of the Karolinska Institute, Stockholm, in an editorial published alongside the studies.
Suboptimal levels of vitamin D are known to be associated with an increased risk of acute respiratory infections, and some observational studies have linked low 25-hydroxyvitamin D (25[OH]D) with more severe COVID-19; however, data on a possible protective effect of vitamin D supplementation in preventing infection have been inconsistent.
U.K. study compares doses
To further investigate the relationship with infections, including COVID-19, in a large cohort, the authors of the first of the two BMJ studies, a phase 3 open-label trial, enrolled 6,200 people in the United Kingdom aged 16 and older between December 2020 and June 2021 who were not taking vitamin D supplements at baseline.
Half of participants were offered a finger-prick blood test, and of the 2,674 who accepted, 86.3% were found to have low concentrations of 25(OH)D (< 75 nmol/L). These participants were provided with vitamin D supplementation at a lower (800 IU/day; n = 1328) or higher dose (3,200 IU/day; n = 1,346) for 6 months. The other half of the group received no tests or supplements.
The results showed minimal differences between groups in terms of rates of developing at least one acute respiratory infection, which occurred in 5% of those in the lower-dose group, 5.7% in the higher-dose group, and 4.6% of participants not offered supplementation.
Similarly, there were no significant differences in the development of real-time PCR-confirmed COVID-19, with rates of 3.6% in the lower-dose group, 3.0% in the higher-dose group, and 2.6% in the group not offered supplementation.
The study is “the first phase 3 randomized controlled trial to evaluate the effectiveness of a test-and-treat approach for correction of suboptimal vitamin D status to prevent acute respiratory tract infections,” report the authors, led by Adrian R. Martineau, MD, PhD, of Barts and The London School of Medicine and Dentistry, Queen Mary University of London.
While uptake and supplementation in the study were favorable, “no statistically significant effect of either dose was seen on the primary outcome of swab test, doctor-confirmed acute respiratory tract infection, or on the major secondary outcome of swab test-confirmed COVID-19,” they conclude.
Traditional use of cod liver oil of benefit?
In the second study, researchers in Norway, led by Arne Soraas, MD, PhD, of the department of microbiology, Oslo University Hospital, evaluated whether that country’s long-held tradition of consuming cod liver oil during the winter to prevent vitamin D deficiency could affect the development of COVID-19 or outcomes.
For the Cod Liver Oil for COVID-19 Prevention Study (CLOC), a large cohort of 34,601 adults with a mean age of 44.9 years who were not taking daily vitamin D supplements were randomized to receive 5 mL/day of cod liver oil, representing a surrogate dose of 400 IU/day of vitamin D (n = 17,278), or placebo (n = 17,323) for up to 6 months.
In contrast with the first study, the vast majority of patients in the CLOC study (86%) had adequate vitamin D levels, defined as greater than 50 nmol/L, at baseline.
Again, however, the results showed no association between increased vitamin D supplementation with cod liver oil and PCR-confirmed COVID-19 or acute respiratory infections, with approximately 1.3% in each group testing positive for COVID-19 over a median of 164 days.
Supplementation with cod liver oil was also not associated with a reduced risk of any of the coprimary endpoints, including other acute respiratory infections.
“Daily supplementation with cod liver oil, a low-dose vitamin D, eicosapentaenoic acid, and docosahexaenoic acid supplement, for 6 months during the SARS-CoV-2pandemic among Norwegian adults did not reduce the incidence of SARS-CoV-2 infection, serious COVID-19, or other acute respiratory infections,” the authors report.
Key study limitations
In his editorial, Dr. Bergman underscores the limitations of two studies – also acknowledged by the authors – including the key confounding role of vaccines that emerged during the studies.
“The null findings of the studies should be interpreted in the context of a highly effective vaccine rolled out during both studies,” Dr. Bergman writes.
In the U.K. study, for instance, whereas only 1.2% of participants were vaccinated at baseline, the rate soared to 89.1% having received at least one dose by study end, potentially masking any effect of vitamin D, he says.
Additionally, for the Norway study, Dr. Bergman notes that cod liver oil also contains a substantial amount of vitamin A, which can be a potent immunomodulator.
“Excessive intake of vitamin A can cause adverse effects and may also interfere with vitamin D-mediated effects on the immune system,” he writes.
With two recent large meta-analyses showing benefits of vitamin D supplementation to be specifically among people who are vitamin D deficient, “a pragmatic approach for the clinician could be to focus on risk groups” for supplementation, Dr. Bergman writes.
“[These include] those who could be tested before supplementation, including people with dark skin, or skin that is rarely exposed to the sun, pregnant women, and elderly people with chronic diseases.”
The U.K. trial was supported by Barts Charity, Pharma Nord, the Fischer Family Foundation, DSM Nutritional Products, the Exilarch’s Foundation, the Karl R. Pfleger Foundation, the AIM Foundation, Synergy Biologics, Cytoplan, the Clinical Research Network of the U.K. National Institute for Health and Care Research, the HDR UK BREATHE Hub, the U.K. Research and Innovation Industrial Strategy Challenge Fund, Thornton & Ross, Warburtons, Hyphens Pharma, and philanthropist Matthew Isaacs.
The CLOC trial was funded by Orkla Health, the manufacturer of the cod liver oil used in the trial. Dr. Bergman has reported no relevant financial relationships.
A version of this article first appeared on Medscape.com.
Two large studies out of the United Kingdom and Norway show vitamin D supplementation has no benefit – as low dose, high dose, or in the form of cod liver oil supplementation – in preventing COVID-19 or acute respiratory tract infections, regardless of whether individuals are deficient or not.
The studies, published in the BMJ, underscore that “vaccination is still the most effective way to protect people from COVID-19, and vitamin D and cod liver oil supplementation should not be offered to healthy people with normal vitamin D levels,” writes Peter Bergman, MD, of the Karolinska Institute, Stockholm, in an editorial published alongside the studies.
Suboptimal levels of vitamin D are known to be associated with an increased risk of acute respiratory infections, and some observational studies have linked low 25-hydroxyvitamin D (25[OH]D) with more severe COVID-19; however, data on a possible protective effect of vitamin D supplementation in preventing infection have been inconsistent.
U.K. study compares doses
To further investigate the relationship with infections, including COVID-19, in a large cohort, the authors of the first of the two BMJ studies, a phase 3 open-label trial, enrolled 6,200 people in the United Kingdom aged 16 and older between December 2020 and June 2021 who were not taking vitamin D supplements at baseline.
Half of participants were offered a finger-prick blood test, and of the 2,674 who accepted, 86.3% were found to have low concentrations of 25(OH)D (< 75 nmol/L). These participants were provided with vitamin D supplementation at a lower (800 IU/day; n = 1328) or higher dose (3,200 IU/day; n = 1,346) for 6 months. The other half of the group received no tests or supplements.
The results showed minimal differences between groups in terms of rates of developing at least one acute respiratory infection, which occurred in 5% of those in the lower-dose group, 5.7% in the higher-dose group, and 4.6% of participants not offered supplementation.
Similarly, there were no significant differences in the development of real-time PCR-confirmed COVID-19, with rates of 3.6% in the lower-dose group, 3.0% in the higher-dose group, and 2.6% in the group not offered supplementation.
The study is “the first phase 3 randomized controlled trial to evaluate the effectiveness of a test-and-treat approach for correction of suboptimal vitamin D status to prevent acute respiratory tract infections,” report the authors, led by Adrian R. Martineau, MD, PhD, of Barts and The London School of Medicine and Dentistry, Queen Mary University of London.
While uptake and supplementation in the study were favorable, “no statistically significant effect of either dose was seen on the primary outcome of swab test, doctor-confirmed acute respiratory tract infection, or on the major secondary outcome of swab test-confirmed COVID-19,” they conclude.
Traditional use of cod liver oil of benefit?
In the second study, researchers in Norway, led by Arne Soraas, MD, PhD, of the department of microbiology, Oslo University Hospital, evaluated whether that country’s long-held tradition of consuming cod liver oil during the winter to prevent vitamin D deficiency could affect the development of COVID-19 or outcomes.
For the Cod Liver Oil for COVID-19 Prevention Study (CLOC), a large cohort of 34,601 adults with a mean age of 44.9 years who were not taking daily vitamin D supplements were randomized to receive 5 mL/day of cod liver oil, representing a surrogate dose of 400 IU/day of vitamin D (n = 17,278), or placebo (n = 17,323) for up to 6 months.
In contrast with the first study, the vast majority of patients in the CLOC study (86%) had adequate vitamin D levels, defined as greater than 50 nmol/L, at baseline.
Again, however, the results showed no association between increased vitamin D supplementation with cod liver oil and PCR-confirmed COVID-19 or acute respiratory infections, with approximately 1.3% in each group testing positive for COVID-19 over a median of 164 days.
Supplementation with cod liver oil was also not associated with a reduced risk of any of the coprimary endpoints, including other acute respiratory infections.
“Daily supplementation with cod liver oil, a low-dose vitamin D, eicosapentaenoic acid, and docosahexaenoic acid supplement, for 6 months during the SARS-CoV-2 pandemic among Norwegian adults did not reduce the incidence of SARS-CoV-2 infection, serious COVID-19, or other acute respiratory infections,” the authors report.
Key study limitations
In his editorial, Dr. Bergman underscores the limitations of the two studies – also acknowledged by the authors – including the key confounding role of vaccines that emerged during the studies.
“The null findings of the studies should be interpreted in the context of a highly effective vaccine rolled out during both studies,” Dr. Bergman writes.
In the U.K. study, for instance, whereas only 1.2% of participants were vaccinated at baseline, the rate soared to 89.1% having received at least one dose by study end, potentially masking any effect of vitamin D, he says.
Additionally, for the Norway study, Dr. Bergman notes that cod liver oil also contains a substantial amount of vitamin A, which can be a potent immunomodulator.
“Excessive intake of vitamin A can cause adverse effects and may also interfere with vitamin D-mediated effects on the immune system,” he writes.
With two recent large meta-analyses showing benefits of vitamin D supplementation to be specifically among people who are vitamin D deficient, “a pragmatic approach for the clinician could be to focus on risk groups” for supplementation, Dr. Bergman writes.
“[These include] those who could be tested before supplementation, including people with dark skin, or skin that is rarely exposed to the sun, pregnant women, and elderly people with chronic diseases.”
The U.K. trial was supported by Barts Charity, Pharma Nord, the Fischer Family Foundation, DSM Nutritional Products, the Exilarch’s Foundation, the Karl R. Pfleger Foundation, the AIM Foundation, Synergy Biologics, Cytoplan, the Clinical Research Network of the U.K. National Institute for Health and Care Research, the HDR UK BREATHE Hub, the U.K. Research and Innovation Industrial Strategy Challenge Fund, Thornton & Ross, Warburtons, Hyphens Pharma, and philanthropist Matthew Isaacs.
The CLOC trial was funded by Orkla Health, the manufacturer of the cod liver oil used in the trial. Dr. Bergman has reported no relevant financial relationships.
A version of this article first appeared on Medscape.com.
FROM BMJ
Influenza vaccine may offer much more than flu prevention
Influenza vaccination is associated with a reduced risk of ischemic stroke, in new findings that suggest the vaccine itself, and not just avoidance of the virus, may be beneficial.
“We postulate that influenza vaccination may have a protective effect against stroke that may be partly independent of influenza prevention,” study investigator Francisco J. de Abajo, MD, PhD, MPH, of the University of Alcalá, Madrid, said in an interview.
“Although the study is observational and this finding can also be explained by unmeasured confounding factors, we feel that a direct biological effect of vaccine cannot be ruled out and this finding opens new avenues for investigation.”
The study was published online in Neurology.
‘Not a spurious association’
While there is a well-established link between seasonal influenza and increased ischemic stroke risk, the role of flu vaccination in stroke prevention is unclear.
In the nested case-control study, researchers evaluated data from primary care practices in Spain between 2001 and 2015. They identified 14,322 patients with first-time ischemic stroke. Of these, 9,542 had noncardioembolic stroke and 4,780 had cardioembolic stroke.
Each case was matched with five controls from the population of age- and sex-matched controls without stroke (n = 71,610).
Those in the stroke group had a slightly higher rate of flu vaccination than controls, at 41.4% versus 40.5% (odds ratio, 1.05).
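As a quick illustration of where an unadjusted figure like this comes from, the snippet below computes a crude odds ratio directly from the exposure proportions in cases and controls. The study's own estimates came from adjusted analyses on matched sets, so this only approximates the raw comparison.

```python
# Crude (unadjusted) odds ratio from exposure prevalences in cases and controls.
# Uses the vaccination rates quoted above (41.4% of stroke cases vs. 40.5% of
# controls); the small discrepancy with the reported OR of 1.05 reflects
# rounding and covariate adjustment in the actual analysis.
def crude_odds_ratio(p_exposed_cases: float, p_exposed_controls: float) -> float:
    odds_cases = p_exposed_cases / (1 - p_exposed_cases)
    odds_controls = p_exposed_controls / (1 - p_exposed_controls)
    return odds_cases / odds_controls

print(round(crude_odds_ratio(0.414, 0.405), 2))  # ~1.04
```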
Adjusted analysis revealed those who received flu vaccination were less likely to experience ischemic stroke within 15-30 days of vaccination (OR, 0.79) and, to a lesser degree, over up to 150 days (OR, 0.92).
The reduced risk associated with the flu vaccine was observed with both types of ischemic stroke and appeared to offer stroke protection outside of flu season.
The reduced risk was also found in subgroup comparisons in men, women, those aged over and under 65 years, and those with intermediate and high vascular risk.
Importantly, a separate analysis of pneumococcal vaccination did not show a similar reduction in stroke risk (adjusted OR, 1.08).
“The lack of protection found with the pneumococcal vaccine actually reinforces the hypothesis that the protection of influenza vaccine is not a spurious association, as both vaccines might share the same biases and confounding factors,” Dr. de Abajo said.
Anti-inflammatory effect?
Influenza infection is known to induce a systemic inflammatory response that “can precipitate atheroma plaque rupture mediated by elevated concentrations of reactive proteins and cytokines,” the investigators noted, and so, avoiding infection could prevent those effects.
The results are consistent with other studies that have shown similar findings, including recent data from the INTERSTROKE trial. However, the reduced risk observed in the current study even in years without a flu epidemic expands on previous findings.
“This finding suggests that other mechanisms different from the prevention of influenza infection – e.g., a direct biological effect – could account for the risk reduction found,” the investigators wrote.
In terms of the nature of that effect, Dr. de Abajo noted that, “at this stage, we can only speculate.
“Having said that, there are some pieces of evidence that suggest influenza vaccination may release anti-inflammatory mediators that can stabilize the atheroma plaque. This is an interesting hypothesis that should be addressed in the near future,” he added.
‘More than just flu prevention’
In an accompanying editorial, Dixon Yang, MD, and Mitchell S.V. Elkind, MD, agree that the findings point to intriguing potential unexpected benefits of the vaccine.
“This case-control study ... importantly suggests the influenza vaccine is more than just about preventing the flu,” they wrote.
Dr. Elkind said in an interview that the mechanism could indeed involve an anti-inflammatory effect.
“There is some evidence that antibiotics also have anti-inflammatory properties that might reduce risk of stroke or the brain damage from a stroke,” he noted. “So, it is plausible that some of the effect of the vaccine on reducing risk of stroke may be through a reduction in inflammation.”
Dr. Elkind noted that the magnitude of the reduction observed with the vaccine, though not substantial, is important. “The magnitude of effect for any one individual may be modest, but it is in the ballpark of the effect of other commonly used approaches to stroke prevention, such as taking an aspirin a day, which reduces risk of stroke by about 20%. But because influenza is so common, the impact of even a small effect for an individual can have a large impact at the population level. So, the results are of public health significance.”
The study received support from the Biomedical Research Foundation of the Prince of Asturias University Hospital and the Institute of Health Carlos III in Madrid. Dr. Elkind has reported receiving ancillary funding but no personal compensation from Roche for a federally funded trial of stroke prevention.
A version of this article first appeared on Medscape.com.
FROM NEUROLOGY
Pivotal trials in blood cancers don’t mirror patient populations
Participants in the pivotal clinical trials that supported approval of new drugs for leukemias and multiple myeloma (MM) do not reflect the racial and ethnic makeup of the U.S. patients who develop these blood cancers, a new study concludes.
“Our analysis shows that, over the past 10 years, participation in pivotal clinical trials investigating therapies for leukemias and MM is unrepresentative of the U.S. population,” say the authors, led by Jorge E. Cortes, MD, of the Georgia Cancer Center at Augusta University, Ga. “Trials should represent the population with the disease,” they comment.
The study was published in the Journal of Clinical Oncology.
“This study confirms that the U.S. cancer population for select hematologic malignancies was inadequately racially and ethnically represented in studies leading to drug approval,” comment the authors of an accompanying editorial.
“The results from this study should lead to questions about the generalizability of drug safety and efficacy in populations we serve as medical hematologists and oncologists,” say Mikkael A. Sekeres, MD, along with Namrata S. Chandhok, MD, both of the division of hematology, Sylvester Comprehensive Cancer Center, University of Miami.
They pose the question, for instance, as physicians practicing in South Florida, where most of their patients are Hispanic, “can we apply the results of these pivotal studies – and drug labels – to them, without any sense of whether they metabolize the drug the same way as those included in the study or have the same biologic targets?”
Analysis of pivotal trials
For their study, Dr. Cortes and colleagues analyzed 61 pivotal trials for leukemia and MM leading to approval of the drugs from the U.S. Food and Drug Administration between 2011 and 2021.
They found that only two-thirds (67.2%) of these trials reported data pertaining to race, while about half (48.8%) reported on ethnicity.
The trials that did report data on race involved a total of 13,731 patients. The vast majority (81.6%) were White, and Black patients represented only 3.8%. Asian/Pacific Islanders made up 9.1%, and American Indians or Alaskan Natives made up just 0.12% of participants, with 1.5% categorized as other.
Among the trials reporting on ethnicity, 4.7% of patients were Hispanic, with 11.5% being Hispanic in acute lymphoblastic leukemia (ALL) trials and 7.6% Hispanic in chronic myeloid leukemia (CML) trials.
Slightly more than half (54.8%) of all trial participants were male, and patients’ average ages ranged from 41.7 to 67.3 years across all malignancies.
Of the minority groups, Asian/Pacific Islanders and Black people had the highest representation in trials involving CML, at 12.7% and 5.3%, respectively.
Their lowest representation was in chronic lymphocytic leukemia (CLL), at 3% and 1.1%, respectively.
Among the trials reporting ethnicity, Hispanic representation ranged from 3.8% of participants in MM trials to 11.5% in ALL trials.
Inconsistent with patient populations
Next, the researchers compared the proportions of race/ethnic groups that were found among the participants of these pivotal trials with the proportions that would be expected in patient populations for each of these blood cancers (according to the U.S. Surveillance, Epidemiology, and End Results [SEER] database).
For example, White people made up 80.3% of participants in clinical trials of MM, whereas they represent 68.7% of patients with MM, a difference that was statistically significant (P < .0001).
The finding was similar for CML, with White people accounting for 90.5% of participants in clinical trials versus 82.5% of the patient population (P < .0001).
For AML, the difference was smaller, with respective percentages of 79.6% versus 77.3% (P = .0389).
For Black people, Asian/Pacific Islanders and Hispanic people, across all five cancer types that were analyzed, the proportion of participants in clinical trials was significantly lower than the proportion in the patient population.
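One simple way to frame such comparisons is a one-sample binomial test of the trial proportion against the SEER-expected share. The sketch below does this for the MM example above; the participant count is a hypothetical round number, because per-cancer denominators are not given here, and the paper's own statistical method may differ.

```python
# Hypothetical illustration: test whether 80.3% White among MM trial participants
# differs from the SEER-expected 68.7%. The denominator (5,000) is invented for
# illustration; it is not taken from the study.
from scipy.stats import binomtest

n_participants = 5_000                              # hypothetical MM trial denominator
observed_white = round(0.803 * n_participants)      # ~4,015 participants
result = binomtest(observed_white, n_participants, p=0.687, alternative="two-sided")
print(f"trial share = {observed_white / n_participants:.3f}, p = {result.pvalue:.1e}")
```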
The analysis also showed that females were overrepresented in clinical trials for two blood cancers. For MM, trial participation was 44.7%, while disease incidence was 41.7% (P < .0001), and for CML the proportions were 44.7% versus 39.5% (P = .0009). However, females were underrepresented in a third blood cancer: in AML, the proportions were 44.7% versus 60.5% (P < .0001).
Geographic location of trials often inaccessible
The study also highlighted an obstacle to minorities participating in clinical trials: geography.
For this analysis, the researchers looked at mortality rates for the various blood cancers.
For AML, they found mortality rates were high across the whole of the United States, but centers conducting AML clinical trials were primarily in the Northeast, with no centers in the Midwest.
Key regions with high rates of AML mortality, low access to trials, and high minority representation were notably clustered in areas including the eastern Carolinas, south Georgia, Alabama, and Mississippi, the authors noted.
“In many instances, trials were absent in areas with high mortality,” they report. “This makes access to clinical trials difficult, if not impossible, to patients who do not have the financial means for travel.”
Further action needed
Racial and ethnic disparities in clinical trials have been widely reported in numerous previous studies, the authors note.
Various initiatives have been launched in recent years to tackle the problem, including the National Institutes of Health Revitalization Act, FDA race and ethnicity guidance, and International Council for Harmonisation guidance.
For oncology, the American Society of Clinical Oncology has also taken steps with the release of the new Equity, Diversity, and Inclusion Action Plan in 2021 to improve representation of minorities in research.
Dr. Cortes and colleagues suggest another step that is needed is standardized reporting of demographics of clinical trial participants.
“More importantly, efforts to increase representation of minorities and disadvantaged populations in clinical trials should be prioritized,” they say.
Dr. Cortes reports a consulting role and receiving research funding from many pharmaceutical companies. No other coauthors have financial disclosures. Dr. Chandhok reports honoraria from Healio, Clinical Care Options, and a consulting role with Servier. Dr. Sekeres reports a consulting role with Celgene, Millennium, Pfizer, Novartis, Syros Pharmaceuticals, Kurome Therapeutics, and institutional research funding from Takeda, Pfizer, Bristol Myers Squibb, Actuate Therapeutics, Sellas Life Sciences, and Bio-Path Holdings.
A version of this article first appeared on Medscape.com.
FROM THE JOURNAL OF CLINICAL ONCOLOGY
Vitamin D deficiency clearly linked to inflammation
Vitamin D deficiency has a causative role in the systemic inflammation that commonly accompanies it, new research shows: as vitamin D levels rise to the normal range, inflammation, reflected by elevated C-reactive protein (CRP), declines.
However, the effect does not work in reverse: Changes in CRP levels did not appear to affect vitamin D levels, first author Elina Hypponen, PhD, a professor in nutritional and genetic epidemiology and director of the Australian Centre for Precision Health, Adelaide, said in an interview.
“Given that the serum CRP level is a widely used biomarker for chronic inflammation, these results suggest that improving vitamin D status may reduce chronic inflammation, but only for people with vitamin D deficiency,” Dr. Hypponen and coauthors reported in their study, published in the International Journal of Epidemiology.
Vitamin D associated with CRP in ‘L-shaped’ manner
Nutritional factors are known to influence systemic inflammation in a variety of ways. However, there has been debate over the association between vitamin D – specifically, serum 25-hydroxyvitamin D (25[OH]D), an indicator of vitamin D status – and CRP, with some reports of observational associations between the two disputed in more robust randomized trials.
To further evaluate the relationship, the authors performed a bidirectional Mendelian randomization analysis, using a cohort of 294,970 unrelated participants of White/British ancestry in the UK Biobank, the largest cohort to date with measured serum 25(OH)D concentrations, they noted.
Overall, the average 25(OH)D concentration was 50.0 nmol/L (range, 10-340 nmol/L), with 11.7% (n = 34,403) of participants having concentrations of less than 25 nmol/L, considered deficient.
The analysis showed that genetically predicted serum 25(OH)D was associated with serum CRP in an L-shaped manner, with CRP levels, and hence inflammation, falling sharply as 25(OH)D concentrations rose toward the normal range.
However, the relationship was only significant among participants with 25(OH)D levels in the deficiency range (< 25 nmol/L), with the association leveling off at about 50 nmol/L of 25(OH)D, which is generally considered a normal level.
The association was supported in further stratified Mendelian randomization analyses, which confirmed an inverse association between serum 25(OH)D in the deficiency range and CRP, but not with higher concentrations of serum vitamin D.
Conversely, neither linear nor nonlinear Mendelian randomization analyses showed a causal effect of serum CRP level on 25(OH)D concentrations.
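For readers less familiar with the method, the toy simulation below sketches the standard linear Mendelian randomization estimator (a Wald/two-stage ratio using a genetic variant as an instrument) that the authors' stratified, nonlinear analysis builds on. All data and effect sizes are simulated placeholders, not UK Biobank values.

```python
# Toy linear Mendelian randomization: a genetic variant that raises 25(OH)D serves
# as an instrument, so the ratio of instrument-outcome to instrument-exposure
# associations recovers the causal effect even with an unmeasured confounder.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
g = rng.binomial(2, 0.3, n)                         # allele count at a vitamin D-raising variant
u = rng.normal(size=n)                              # unmeasured confounder
vit_d = 40 + 5 * g + 4 * u + rng.normal(size=n)     # simulated 25(OH)D, nmol/L
crp = 3.0 - 0.05 * vit_d + 1.5 * u + rng.normal(size=n)  # true effect: -0.05 per nmol/L

beta_gx = np.polyfit(g, vit_d, 1)[0]                # instrument -> exposure slope
beta_gy = np.polyfit(g, crp, 1)[0]                  # instrument -> outcome slope
print("MR (Wald ratio) estimate:", round(beta_gy / beta_gx, 3))   # ~ -0.05

# A naive regression of CRP on 25(OH)D is biased by the confounder u.
print("Confounded OLS slope:", round(np.polyfit(vit_d, crp, 1)[0], 3))
```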
The findings suggest that “improving vitamin D status in the deficiency range could reduce systemic low-grade inflammation and potentially mitigate the risk or severity of chronic illnesses with an inflammatory component,” the authors noted.
Dr. Hypponen added that the greatest reductions in CRP are observed with correction of the most severe vitamin D deficiency.
“The strongest benefits of improving concentrations will be seen for people with severe deficiency,” Dr. Hypponen said in an interview.
“In our study, much of the benefit was achieved when people reached the National Academy of Sciences endorsed cutoff of 50 nmol/L [for vitamin D sufficiency].”
Prohormone effects?
The anti-inflammatory effects observed with serum vitamin D could be related to its role as a prohormone that can impact vitamin D receptor–expressing immune cells, such as monocytes, B cells, T cells, and antigen-presenting cells, the authors noted.
“Indeed, cell experiments have shown that active vitamin D can inhibit the production of proinflammatory cytokines, including [tumor necrosis factor]–alpha, interleukin-1b, IL-6, IL-8, and IL-12, and promote the production of IL-10, an anti-inflammatory cytokine,” they explained.
In that regard, adequate vitamin D concentrations could be important in preventing inflammation-related complications from obesity and reduce the risk or severity of chronic illnesses with an inflammatory component, such as cardiovascular diseases, diabetes, autoimmune diseases, neurodegenerative conditions, and others, the authors noted.
Previous studies unable to assess effect of deficiency
While the current findings contradict other studies that have used Mendelian randomization and showed no causal effect of 25(OH)D on CRP, those previous studies only used a standard linear Mendelian randomization method that could not rule out the possibility of a ‘threshold effect’ restricted to vitamin D deficiency, the authors noted.
“Indeed, it is logical to expect that improving vitamin D status would be relevant only in the presence of vitamin D deficiency, whereas any further additions may be redundant and, in the ... extreme of supplementation, might become toxic,” they wrote.
However, the nonlinear Mendelian randomization approach used in the current study allows for better detection of the association, and the authors point out that the method has also been recently used in research showing an adverse effect of vitamin D deficiency on cardiovascular disease risk and mortality, which would not be visible using the standard linear Mendelian randomization approach.
Meanwhile, the current findings add to broader research showing benefits of increases in vitamin D to be mainly limited to those who are deficient, with limited benefit of supplementation for those who are not, Dr. Hypponen emphasized.
“We have repeatedly seen evidence for health benefits for increasing vitamin D concentrations in individuals with very low levels, while for others, there appears to be little to no benefit,” Dr. Hypponen said in a press statement.
“These findings highlight the importance of avoiding clinical vitamin D deficiency and provide further evidence for the wide-ranging effects of hormonal vitamin D,” she added.
The study was financially supported by the National Health and Medical Research Council, Australia. The authors reported no relevant financial relationships.
A version of this article first appeared on Medscape.com.
FROM THE INTERNATIONAL JOURNAL OF EPIDEMIOLOGY
Where women’s voices still get heard less
“Our study provides the first analysis of gender and early-career faculty disparities in speakers at hematology and medical oncology board review meetings,” the authors reported in research published in Blood Advances.
“We covered six major board reviews over the last 5 years that are either conducted yearly or every other year, [and] the general trend across all meetings showed skewness toward men speakers,” the authors reported.
Recent data from 2021 suggest a narrowing of the gender gap in oncology, with women making up 44.6% of oncologists in training. However, women still represent only 35.2% of practicing oncologists and remain underrepresented in leadership positions in academic oncology, the authors reported.
With speaking roles at academic meetings potentially marking a key step in career advancement and improved opportunities, the authors sought to investigate the balance of gender, as well as early-career faculty among speakers at prominent hematology and/or oncology board review lecture series taking place in the United States between 2017 and 2021.
The five institutions and one society presenting the board review lecture series included Baylor College of Medicine/MD Anderson Cancer Center, both in Houston; Dana-Farber Brigham Cancer Center, Boston; George Washington University, Washington; Memorial Sloan Kettering Cancer Center, New York; Seattle Cancer Care Alliance; and the hematology board review series from the American Society of Hematology.
During the period in question, among 1,224 board review lectures presented, women constituted only 37.7% of the speakers. In lectures presented by American Board of Internal Medicine–certified speakers (n = 1,016, 83%), women made up less than 50% of speakers in five of six courses.
Men were also more likely to be recurrent speakers; across all courses, 13 men but only 2 women conducted 10 or more lectures. And while 35 men gave six or more lectures across all courses, only 12 women did so.
The lecture topics with the lowest rates of women presenters included malignant hematology (24.8%), solid tumors (38.9%), and benign hematology lectures (44.1%).
“We suspected [the imbalance in malignant hematology] since multiple recurrent roles were concentrated in the malignant hematology,” senior author Samer Al Hadidi, MD, of the Myeloma Center, Winthrop P. Rockefeller Cancer Institute, University of Arkansas for Medical Sciences, Little Rock, Ark., said in an interview.
He noted that “there are no regulations that such courses need to follow to ensure certain proportions of women and junior faculty are involved.”
Early-career faculty
In terms of early-career representation, more than 50% of lectures were given by faculty who had received their initial certifications more than 15 years earlier. The median time from initial certification was 12.5 years for hematology and 14 years for medical oncology.
The findings that more than half of the board review lectures were presented by faculty with more than 15 years’ experience since initial certification “reflects a lack of appropriate involvement of early-career faculty, who arguably may have more recent experience with board certification,” the authors wrote.
While being underrepresented in such roles is detrimental, there are no regulations that such courses follow to ensure certain proportions of women and junior faculty are involved, Dr. Al Hadidi noted.
Equal representation remains elusive
The study does suggest some notable gains. In a previous study of 181 academic conferences in the United States and Canada between 2007 and 2017, the rate of women speakers was only 15%, compared with 37.7% in the new study.
And an overall trend analysis in the study shows an approximately 10% increase in the representation of women across the board reviews. However, only the ASH hematology board review achieved more than 50% women speakers in its two courses.
“Overall, the proportion of women speakers is improving over the years, though it remains suboptimal,” Dr. Al Hadidi said.
The authors noted that oncology is clearly not the only specialty with gender disparities, documenting a lack of women speakers at otolaryngology–head and neck, radiation oncology, emergency medicine, and research conferences.
They pointed to the work of ASH’s Women in Hematology Working Group as an important example of the needed effort to improve the balance of women hematologists.
Ariela Marshall, MD, director of women’s thrombosis and hemostasis at Penn Medicine in Philadelphia and a leader of ASH’s Women in Hematology Working Group, agreed that more effort is needed to address both gender disparities and those affecting early-career speakers, asserting that the two appear to be connected.
“If you broke down gender representation over time and the faculty/time since initial certification, the findings may mirror the percent of women in hematology-oncology at that given point in time,” Dr. Marshall said in an interview.
“If an institution is truly committed to taking action on gender equity, it needs to look at gender and experience equity of speakers,” she said. “Perhaps it’s the time to say ‘Dr. X has been doing this review course for 15 years. Let’s give someone else a chance.’
“This is not even just from a gender equity perspective but from a career development perspective overall,” she added. “Junior faculty need these speaking engagements a lot more than senior faculty.”
Meanwhile, the higher number of female trainees is a trend that ideally will be sustained as those trainees move into positions of leadership, Dr. Marshall noted.
“We do see that over time, we have achieved gender equity in the percent of women matriculating to medical school. And my hope is that, 20 years down the line, we will see the effects of this reflected in increased equity in leadership positions such as division/department chair, dean, and hospital CEO,” she said. “However, we have a lot of work to do because there are still huge inequities in the culture of medicine (institutional and more broadly), including gender-based discrimination, maternal discrimination, and high attrition rates for women physicians, compared to male physicians.
“It’s not enough to simply say, ‘well, we have fixed the problem because our incoming medical student classes are now equitable in gender distribution,’” she said.
The authors and Dr. Marshall had no disclosures to report.
FROM BLOOD ADVANCES
Ultrasound-guided nerve blocks improve fracture pain
Ultrasound-guided peripheral nerve blocks significantly reduce pain and the need for systemic analgesics in patients with hip fracture, compared with conventional analgesia, results from a meta-analysis published in BMC Anesthesiology show.
With the caveat that the quality of evidence in most trials in the analysis is low owing to a lack of blinding and other factors, “our review suggests that, among patients suffering from a hip fracture, a preoperative ultrasound-guided peripheral nerve block is associated with a significant pain reduction and reduced need for systemic analgesics compared to conventional analgesia,” reported the authors.
“Our results may also indicate a lower risk of delirium, serious adverse events and higher patient satisfaction in patients receiving an ultrasound-guided peripheral nerve block,” they added.
Because hip fractures commonly affect older populations and those who are frail, treatment of the substantial pain that can occur perioperatively is a challenge.
Peripheral nerve blocks have been shown to reduce pain within 30 minutes of block placement; however, most studies have primarily included blocks guided by anatomic landmarks or nerve stimulation. The use of ultrasound guidance with the nerve block should improve efficacy, the authors noted.
“It seems intuitive that using ultrasound-guidance should be more effective than using a blind technique, since it allows a trained physician to deposit the local anesthetic with much more precision,” they wrote.
To evaluate the data from studies that have looked at ultrasound-guided peripheral nerve blocks, Oskar Wilborg Exsteen, of the department of anesthesiology and intensive care, Copenhagen University Hospital and Nordsjællands Hospital, Hillerød, Denmark, and colleagues identified 12 randomized controlled trials, involving a combined total of 976 participants, for the meta-analysis.
The studies included 509 participants who received ultrasound-guided peripheral nerve blocks, specifically the femoral nerve block and fascia iliaca block, and 476 who were randomly assigned to control groups.
Overall, those treated with the nerve blocks showed significantly greater reductions in pain measured closest to 2 hours of block placement, compared with conventional analgesia, with a mean reduction of 2.26 points on the Visual Analogue Scale (VAS) (range, 0-10; P < .001).
Ultrasound-guided peripheral nerve block use was also associated with lower preoperative use of intravenous morphine equivalents, in milligrams, reported in four of the trials (random effects model mean difference, –5.34; P = .003).
Delirium was also significantly lower with the nerve blocks (risk ratio, 0.6; P = .03), as were serious adverse events, compared with standard analgesia (RR, 0.33; P = .006), whereas patient satisfaction was significantly higher with the nerve blocks (mean VAS difference, 25.9 [score 0-100]; P < .001).
Seven of the studies had monitored for serious adverse events or complications related to the nerve blocks, but none reported any complications directly related to the ultrasound-guided peripheral nerve blocks.
Owing to the inability to conduct blinded comparisons, clinical heterogeneity, and other caveats, the quality of evidence was ultimately judged to be “low” or “very low”; however, the observed benefits are nevertheless relevant, the authors concluded.
“Despite the low quality of evidence, ultrasound-guided blocks were associated with benefits compared to conventional systemic analgesia,” they said.
Key caveats include that the morphine reductions observed with the nerve blocks were not substantial, they noted. “The opioid-sparing effect seems small and may be of less clinical importance.” The decreases in opioid consumption and pain observed in the analysis are, in fact, similar to those reported for conventional peripheral nerve blocks performed without ultrasound guidance, compared with standard pain management.
No trials were identified that directly compared ultrasound-guided peripheral nerve blocks with nerve block techniques that didn’t use ultrasound.
However, the other noted improvements carry more weight, the authors said.
“The potential for higher patient satisfaction and reduction in serious adverse events and delirium may be of clinical importance,” they wrote.
Ultrasound-guided peripheral nerve blocks not always accessible
Of note, the use of ultrasound-guided peripheral nerve blocks appears to be somewhat low, with one observational trend study of national data in the United States showing that, among patients receiving a peripheral nerve block for hip arthroplasty, only 3.2% of the procedures were performed using ultrasound guidance.
Stephen C. Haskins, MD, a coauthor on that study, said that the low utilization underscores that, in real-world practice, an ultrasound-guided approach isn’t always convenient.
“I think our findings demonstrate a common misconception that exists for those of us that work at academic institutions and/or within the ivory towers of regional anesthesia, which is that everyone is performing cutting edge ultrasound-guided techniques for all procedures,” Dr. Haskins, an associate attending anesthesiologist and chief medical diversity officer with the department of anesthesiology, critical care & pain management at the Hospital for Special Surgery in New York, said in an interview.
However, “there are many limitations to use of ultrasound for these blocks, including limited access to machines, limited access to training, and limited interest and support from our surgical colleagues,” he explained.
“Ultimately, the best nerve block is the one performed in a timely and successful fashion, regardless of technique,” he said. “But we will continue to see a trend towards ultrasound use in the future due to increasing access in the form of portability and affordability.”
Dr. Haskins noted that newer ultrasound-guided regional nerve blocks that were not reviewed in the study, such as the pericapsular nerve group block and the supra-inguinal fascia iliaca block, provide additional benefits such as avoiding quadriceps weakness.
Jeff Gadsden, MD, chief of the orthopedics, plastic, and regional anesthesiology division at Duke University Medical Center, Durham, N.C., agreed, noting that much has changed since the older studies in the analysis, which date back to 2010.
“A fascia iliaca block done in 2022 looks a lot different than it did in 2012, and we would expect it to be more consistent, reliable and longer-lasting with current techniques and technology,” he said in an interview. “So, if anything, I would expect the findings of this analysis to undersell the benefits of peripheral nerve blocks in this population.”
Although the quality of evidence in the meta-analysis is described as “low,” the downsides of the procedures are few, and “the potential benefits [of ultrasound-guided peripheral nerve blocks] are just too good to ignore,” Dr. Gadsden emphasized.
“If we can avoid or reduce opioids in this population and at the same time reduce the acute pain from the injury, there is no question that the incidence of delirium will go down,” he said. “Delirium is associated with a number of poor outcomes following hip fracture, including increased mortality.
“The bottom line is that the risk/benefit ratio is so far in favor of performing the blocks that even in the face of ‘modest’ levels of evidence, we should all be doing these.”
The authors, Dr. Haskins, and Dr. Gadsden had no disclosures relating to the study to report.
A version of this article first appeared on Medscape.com.
FROM BMC ANESTHESIOLOGY
Strength training overcomes bone effects of vegan diet
People who maintain a vegan diet show significant deficits in bone microarchitecture, compared with omnivores; however, resistance training not only appears to improve those deficits but may have a stronger effect in vegans, suggesting an important strategy in maintaining bone health with a vegan diet.
“We expected better bone structure in both vegans and omnivores who reported resistance training,” first author Robert Wakolbinger-Habel, MD, PhD, of St. Vincent Hospital Vienna and the Medical University of Vienna, said in an interview.
“However, we expected [there would still be] differences in structure between vegans and omnivores [who practiced resistance training], as previous literature reported higher fracture rates in vegans,” he said. “Still, the positive message is that ‘pumping iron’ could counterbalance these differences between vegans and omnivores.”
The research was published online in The Endocrine Society’s Journal of Clinical Endocrinology & Metabolism.
Exercise significantly impacts bone health in vegans
The potential effects of the plant-based vegan diet on bone health have been reported in studies linking the diet to an increased risk of fractures and lower bone mineral density (BMD), with common theories including lower bone- and muscle-building protein in vegan diets.
However, most previous studies have not considered other key factors, such as the effects of exercise, the authors noted.
“While previous studies on bone health in vegans only took BMD, biochemical and nutritional parameters into account, they did not consider the significant effects of physical activity,” they wrote.
“By ignoring these effects, important factors influencing bone health are neglected.”
For the study, 88 participants were enrolled in Vienna, with vegan participants recruited with the help of the Austrian Vegan Society.
Importantly, the study documented participants’ bone microarchitecture, a key measure of bone strength that has also not been previously investigated in vegans, using high-resolution peripheral quantitative CT.
Inclusion criteria were maintenance of an omnivorous diet of meat and plant-based foods, or a vegan diet, for at least 5 years; a body mass index (BMI) of 18.5-30 kg/m2 (neither underweight nor obese); age 30-50 years; and premenopausal status for women.
Of the participants, 43 were vegan and 45 were omnivores, with generally equal ratios of men and women.
Vegan bone deficits disappear with strength training
Overall, compared with omnivores, the vegan group showed significant deficits in 7 of 14 measures of BMI-adjusted trabecular and cortical structure (all P < .05).
Among participants who reported no resistance training, vegans still showed significant decreases in bone microarchitecture, compared with omnivores, including radius trabecular BMD, radius trabecular bone volume fraction, and other tibial and cortical bone microarchitecture measures.
However, among those who did report progressive resistance training (20 vegans and 25 omnivores), defined as using machines, free weights, or bodyweight resistance exercises at least once a week, those differences disappeared, with no significant differences in BMI-adjusted bone microarchitecture between vegans and omnivores.
Of note, no significant differences in bone microarchitecture were observed between those who performed exclusively aerobic activities and those who reported no sports activities in the vegan or omnivore group.
Based on the findings, “other types of exercise such as aerobics, cycling, etc, would not be sufficient for a similar positive effect on bone [as resistance training],” Dr. Wakolbinger-Habel said.
Although the findings suggest that resistance training seemed to allow vegans to “catch up” with omnivores in terms of bone microarchitecture, Dr. Wakolbinger-Habel cautioned that a study limitation is the relatively low number of participants.
“The absolute numbers suggest that in vegans the differences, and the relative effect, respectively of resistance training might be larger,” he said. “However, the number of participants in the subgroups is small and it is still an observational study, so we need to be careful in drawing causal conclusions.”
Serum bone markers were within normal ranges across all subgroups. And although there were some correlations between nutrient intake and bone microarchitecture among vegans who did and did not practice resistance training, no conclusions could be drawn from that data, the authors noted.
“Based on our data, the structural [differences between vegans and omnivores] cannot solely be explained by deficits in certain nutrients according to lifestyle,” the authors concluded.
Mechanisms
The mechanisms by which progressive resistance training could produce these benefits include mechanotransduction, in which mechanical loads stimulate key pathways involved in bone formation, the authors explained.
Such effects have been observed in other studies, including one showing that, among young adult runners, the addition of resistance training once a week was associated with significantly greater BMD.
“Veganism is a global trend with strongly increasing numbers of people worldwide adhering to a purely plant-based diet,” first author Christian Muschitz, MD, also of St. Vincent Hospital Vienna and the Medical University of Vienna, said in a press statement.
“Our study showed resistance training offsets diminished bone structure in vegan people when compared to omnivores,” he said.
Dr. Wakolbinger-Habel recommended that, based on the findings, “exercise, including resistance training, should be strongly advocated [for vegans], I would say, at least two times per week.”
The authors reported no relevant financial relationships.
A version of this article first appeared on Medscape.com.
People who maintain a vegan diet show significant deficits in bone microarchitecture, compared with omnivores; however, resistance training not only appears to improve those deficits but may have a stronger effect in vegans, suggesting an important strategy in maintaining bone health with a vegan diet.
“We expected better bone structure in both vegans and omnivores who reported resistance training,” first author Robert Wakolbinger-Habel, MD, PhD, of St. Vincent Hospital Vienna and the Medical University of Vienna, said in an interview.
“However, we expected [there would still be] differences in structure between vegans and omnivores [who practiced resistance training], as previous literature reported higher fracture rates in vegans,” he said. “Still, the positive message is that ‘pumping iron’ could counterbalance these differences between vegans and omnivores.”
The research was published online in The Endocrine Society’s Journal of Clinical Endocrinology & Metabolism.
Exercise significantly impacts bone health in vegans
The potential effects of the plant-based vegan diet on bone health have been reported in studies linking the diet to an increased risk of fractures and lower bone mineral density (BMD), with common theories including lower bone- and muscle-building protein in vegan diets.
However, most previous studies have not considered other key factors, such as the effects of exercise, the authors noted.
“While previous studies on bone health in vegans only took BMD, biochemical and nutritional parameters into account, they did not consider the significant effects of physical activity,” they wrote.
“By ignoring these effects, important factors influencing bone health are neglected.”
For the study, 88 participants were enrolled in Vienna, with vegan participants recruited with the help of the Austrian Vegan Society.
Importantly, the study documented participants’ bone microarchitecture, a key measure of bone strength that has also not been previously investigated in vegans, using high-resolution peripheral quantitative CT.
Inclusion criteria included maintaining an omnivore diet of meat and plant-based foods or a vegan diet for at least 5 years, not being underweight or obese (body mass index [BMI], 18.5-30 kg/m2), being age 30-50 years, and being premenopausal.
Of the participants, 43 were vegan and 45 were omnivores, with generally equal ratios of men and women.
Vegan bone deficits disappear with strength training
Overall, compared with omnivores, the vegan group showed significant deficits in 7 of 14 measures of BMI-adjusted trabecular and cortical structure (all P < .05).
Among participants who reported no resistance training, vegans still showed significant decreases in bone microarchitecture, compared with omnivores, including radius trabecular BMD, radius trabecular bone volume fraction, and other tibial and cortical bone microarchitecture measures.
However, among those who did report progressive resistant training (20 vegans and 25 omnivores), defined as using machines, free weights, or bodyweight resistance exercises at least once a week, those differences disappeared and there were no significant differences in BMI-adjusted bone microarchitecture between vegans and omnivores after the 5 years.
Of note, no significant differences in bone microarchitecture were observed between those who performed exclusively aerobic activities and those who reported no sports activities in the vegan or omnivore group.
Based on the findings, “other types of exercise such as aerobics, cycling, etc, would not be sufficient for a similar positive effect on bone [as resistance training],” Dr. Wakolbinger-Habel said.
Although the findings suggest that resistance training seemed to allow vegans to “catch up” with omnivores in terms of bone microarchitecture, Dr. Wakolbinger-Habel cautioned that a study limitation is the relatively low number of participants.
“The absolute numbers suggest that in vegans the differences, and the relative effect, respectively of resistance training might be larger,” he said. “However, the number of participants in the subgroups is small and it is still an observational study, so we need to be careful in drawing causal conclusions.”
Serum bone markers were within normal ranges across all subgroups. And although there were some correlations between nutrient intake and bone microarchitecture among vegans who did and did not practice resistance training, no conclusions could be drawn from that data, the authors noted.
“Based on our data, the structural [differences between vegans and omnivores] cannot solely be explained by deficits in certain nutrients according to lifestyle,” the authors concluded.
Mechanisms
The mechanisms by which progressive resistance training could result in the benefits include that mechanical loads trigger stimulation of key pathways involved in bone formation, or mechanotransduction, the authors explained.
The unique effects have been observed in other studies, including one study showing that, among young adult runners, the addition of resistance training once a week was associated with significantly greater BMD.
“Veganism is a global trend with strongly increasing numbers of people worldwide adhering to a purely plant-based diet,” first author Christian Muschitz, MD, also of St. Vincent Hospital Vienna and the Medical University of Vienna, said in a press statement.
“Our study showed resistance training offsets diminished bone structure in vegan people when compared to omnivores,” he said.
Dr. Wakolbinger-Habel recommended that, based on the findings, “exercise, including resistance training, should be strongly advocated [for vegans], I would say, at least two times per week.”
The authors reported no relevant financial relationships.
A version of this article first appeared on Medscape.com.
People who maintain a vegan diet show significant deficits in bone microarchitecture, compared with omnivores; however, resistance training not only appears to improve those deficits but may have a stronger effect in vegans, suggesting an important strategy in maintaining bone health with a vegan diet.
“We expected better bone structure in both vegans and omnivores who reported resistance training,” first author Robert Wakolbinger-Habel, MD, PhD, of St. Vincent Hospital Vienna and the Medical University of Vienna, said in an interview.
“However, we expected [there would still be] differences in structure between vegans and omnivores [who practiced resistance training], as previous literature reported higher fracture rates in vegans,” he said. “Still, the positive message is that ‘pumping iron’ could counterbalance these differences between vegans and omnivores.”
The research was published online in The Endocrine Society’s Journal of Clinical Endocrinology & Metabolism.
Exercise significantly impacts bone health in vegans
The potential effects of the plant-based vegan diet on bone health have been reported in studies linking the diet to an increased risk of fractures and lower bone mineral density (BMD), with common theories including lower bone- and muscle-building protein in vegan diets.
However, most previous studies have not considered other key factors, such as the effects of exercise, the authors noted.
“While previous studies on bone health in vegans only took BMD, biochemical and nutritional parameters into account, they did not consider the significant effects of physical activity,” they wrote.
“By ignoring these effects, important factors influencing bone health are neglected.”
For the study, 88 participants were enrolled in Vienna, with vegan participants recruited with the help of the Austrian Vegan Society.
Importantly, the study documented participants’ bone microarchitecture, a key measure of bone strength that has also not been previously investigated in vegans, using high-resolution peripheral quantitative CT.
Inclusion criteria included maintaining an omnivore diet of meat and plant-based foods or a vegan diet for at least 5 years, not being underweight or obese (body mass index [BMI], 18.5-30 kg/m2), being age 30-50 years, and being premenopausal.
Of the participants, 43 were vegan and 45 were omnivores, with generally equal ratios of men and women.
Vegan bone deficits disappear with strength training
Overall, compared with omnivores, the vegan group showed significant deficits in 7 of 14 measures of BMI-adjusted trabecular and cortical structure (all P < .05).
Among participants who reported no resistance training, vegans still showed significant decreases in bone microarchitecture, compared with omnivores, including radius trabecular BMD, radius trabecular bone volume fraction, and other tibial and cortical bone microarchitecture measures.
However, among those who did report progressive resistance training (20 vegans and 25 omnivores), defined as using machines, free weights, or bodyweight resistance exercises at least once a week, those differences disappeared: there were no significant differences in BMI-adjusted bone microarchitecture between vegans and omnivores despite at least 5 years on their respective diets.
Of note, no significant differences in bone microarchitecture were observed between those who performed exclusively aerobic activities and those who reported no sports activities in the vegan or omnivore group.
Based on the findings, “other types of exercise such as aerobics, cycling, etc, would not be sufficient for a similar positive effect on bone [as resistance training],” Dr. Wakolbinger-Habel said.
Although the findings suggest that resistance training seemed to allow vegans to “catch up” with omnivores in terms of bone microarchitecture, Dr. Wakolbinger-Habel cautioned that a study limitation is the relatively low number of participants.
“The absolute numbers suggest that in vegans the differences, and the relative effect, respectively of resistance training might be larger,” he said. “However, the number of participants in the subgroups is small and it is still an observational study, so we need to be careful in drawing causal conclusions.”
Serum bone markers were within normal ranges across all subgroups. And although there were some correlations between nutrient intake and bone microarchitecture among vegans who did and did not practice resistance training, no conclusions could be drawn from that data, the authors noted.
“Based on our data, the structural [differences between vegans and omnivores] cannot solely be explained by deficits in certain nutrients according to lifestyle,” the authors concluded.
Mechanisms
As for the mechanisms underlying the benefit, the authors explained that the mechanical loads imposed by progressive resistance training stimulate key pathways involved in bone formation, a process known as mechanotransduction.
These unique effects of resistance training have also been observed in other studies, including one showing that, among young adult runners, adding resistance training once a week was associated with significantly greater BMD.
“Veganism is a global trend with strongly increasing numbers of people worldwide adhering to a purely plant-based diet,” coauthor Christian Muschitz, MD, also of St. Vincent Hospital Vienna and the Medical University of Vienna, said in a press statement.
“Our study showed resistance training offsets diminished bone structure in vegan people when compared to omnivores,” he said.
Dr. Wakolbinger-Habel recommended that, based on the findings, “exercise, including resistance training, should be strongly advocated [for vegans], I would say, at least two times per week.”
The authors reported no relevant financial relationships.
A version of this article first appeared on Medscape.com.
FROM THE JOURNAL OF CLINICAL ENDOCRINOLOGY & METABOLISM
Hyperthyroidism rebound in pregnancy boosts adverse outcomes
Discontinuing antithyroid drugs during early pregnancy is linked to a possible rebound of hyperthyroidism and a high risk of adverse pregnancy outcomes, new research shows.
“Our study provides preliminary evidence that the risk of rebound increases in women with subnormal thyroid-stimulating hormone (TSH) and/or positive thyrotropin receptor antibody (TRAb) who stop antithyroid drugs in early pregnancy,” first author Xin Hou told this news organization.
“When discussing the pros and cons of antithyroid drug withdrawal early in pregnancy [clinicians] should consider the level of TSH and TRAb in early pregnancy,” said Hou, of the department of endocrinology and metabolism, Institute of Endocrinology, The First Affiliated Hospital of China Medical University, Shenyang.
Suvi Turunen, MD, of the University of Oulu (Finland), who has also conducted research on the issue, said the study adds important insights.
“I find this study very interesting,” Dr. Turunen said in an interview. “It is well known that medical treatment of hyperthyroidism outweighs the potential harms of antithyroid treatment.”
The new findings add to the evidence, she added. “I think that withdrawal of antithyroid drugs should be carefully considered, especially with autoantibody-positive patients,” Dr. Turunen said.
Hyperthyroidism a risk in pregnancy – with or without treatment
The potential risks of hyperthyroidism in pregnancy are well established and can range from preeclampsia to premature birth or miscarriage.
However, antithyroid drugs, including methimazole and propylthiouracil, carry their own risks. In crossing the placental barrier, the drugs can increase the risk of birth defects, particularly during 6-10 weeks of gestation, yet their discontinuation is linked to as much as a 50%-60% risk of relapse, the authors explain.
Because of the risks, the American Thyroid Association recommends that “women with a stable euthyroid state on 5-10 mg methimazole per day achieved within a few months, and a falling TRAb level, are likely candidates to withdraw from antithyroid drug therapy in early pregnancy,” the authors noted.
However, as the recommendations for women who are already pregnant are largely based on evidence from nonpregnant patients, Hou and colleagues sought to evaluate withdrawal among women who were pregnant.
For the study, published in Thyroid, they enrolled 63 pregnant women seen between September 2014 and March 2017 at the outpatient service of the department of endocrinology and metabolism at The First Affiliated Hospital of China Medical University who had well-controlled hyperthyroidism in early pregnancy and discontinued the drugs.
The women had an average age of 27 years; 28 were multigravida, and 22 had a history of miscarriage.
A follow-up of the patients until the end of their pregnancy showed that, overall, 20 (31.7%) had a rebound of hyperthyroidism during their pregnancy after withdrawing from the drugs.
Key factors associated with the highest risk of a rebound after discontinuation included having subnormal TSH levels (TSH < 0.35 mIU/L; odds ratio, 5.12; P = .03) or having positive TRAb (TRAb > 1.75 IU/L; OR, 3.79; P = .02) at the time of medication withdrawal, compared with those with either normal TSH levels or negative TRAb.
The combination of both subnormal TSH and positive TRAb at the time of antithyroid medication withdrawal further boosted the risk of hyperthyroidism rebound (83.3%, 5 of 6), compared with those who had both normal TSH and negative TRAb (13%, 3 of 23; OR, 33.33; P = .003).
Adverse pregnancy outcomes increased
Importantly, among the 20 patients who had a rebound, 11 (55%) had adverse pregnancy outcomes, including miscarriage, premature birth, induced labor, gestational hypertension, and gestational diabetes, compared with only 4 (9.3%) of the 43 who had no rebound (OR, 11.92; P = .0002).
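As a purely illustrative aside (not the authors’ analysis code), the short Python sketch below shows how unadjusted odds ratios such as the 33.33 and 11.92 reported above follow directly from the raw counts given in the study.

# Illustrative sketch only: reproducing the reported unadjusted odds ratios
# from the 2x2 counts; function name and structure are assumptions, not the study's code.
def odds_ratio(events_a, total_a, events_b, total_b):
    """Odds ratio: (events_a/nonevents_a) divided by (events_b/nonevents_b)."""
    return (events_a / (total_a - events_a)) / (events_b / (total_b - events_b))

# Rebound: 5 of 6 women with subnormal TSH plus positive TRAb vs. 3 of 23 with neither.
print(round(odds_ratio(5, 6, 3, 23), 2))    # 33.33, matching the reported OR
# Adverse pregnancy outcomes: 11 of 20 women with rebound vs. 4 of 43 without.
print(round(odds_ratio(11, 20, 4, 43), 2))  # 11.92, matching the reported OR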
Neonatal abnormalities were also more frequent among those experiencing a rebound (20% vs. 4.7%); however, the authors noted that “larger prospective studies are required to conclude whether antithyroid drug withdrawal affects fetal outcome.”
In the rebound group, the mean duration of antithyroid medication use was 24.7 months, versus 35.1 months in the nonrebound group; however, the difference was not statistically significant (P = .07). Similarly, 40% of the rebound group had a history of miscarriage versus 32.6% of the nonrebound group, a difference that was also not significant (P = .56).
The authors noted that half of those in the rebound group developed hyperthyroidism more than 4 weeks after their withdrawal from antithyroid medications, “which seemed to have circumvented the most sensitive period of teratogenesis between 6 and 10 weeks of pregnancy.”
Hou added that restarting antithyroid medication did not increase the risk of adverse outcomes for offspring.
“A low dose of antithyroid medications may be a good choice for women with subnormal TSH and/or positive TRAb in early pregnancy,” Hou concluded. “Because of the small size of our study, a larger prospective study is needed to overcome the potential selection bias and to verify the conclusions.”
Findings consistent with Finnish study
In her own recent study, which included 2,144 women in Finland who experienced hyperthyroidism during pregnancy, Dr. Turunen and colleagues found that having hyperthyroidism, with or without antithyroid drug treatment, was associated with an increased odds of pregnancy and/or prenatal complications, compared with those without thyroid disease.
“In our study, we observed an increased risk of adverse pregnancy outcomes also in mothers with previous diagnosis and/or treatment of hyperthyroidism, not only with overt hyperthyroidism treated with antithyroid drugs,” she told this news organization.
“I think that especially those patients with positive antibodies [TRAbs] are at risk even if they are euthyroid,” she noted. “Withdrawal of antithyroid drugs in these patients is a risk.”
“Probably continuing antithyroid treatment with low dose is a better option,” she said.
The authors and Dr. Turunen reported no relevant financial relationships.
A version of this article first appeared on Medscape.com.
FROM THYROID