Phase 3 study of new levodopa/carbidopa delivery system meets all efficacy endpoints
BOSTON – A phase 3 trial of a new continuous subcutaneous levodopa/carbidopa delivery system met all of its efficacy endpoints, according to results presented at the 2023 annual meeting of the American Academy of Neurology. When compared with optimized oral immediate-release medication, the delivery system, called ND0612 (NeuroDerm, Rehovot, Israel), improved ON time without troublesome dyskinesias and improved symptoms as rated by both patients and clinicians, reported Alberto J. Espay, MD, professor of neurology and director of the Gardner Family Center for Parkinson’s Disease and Movement Disorders, University of Cincinnati.
The new delivery system addresses the challenge of reducing the variability in levodopa plasma concentrations, a major factor in motor fluctuations and diminishing benefit from orally administered drug, according to Dr. Espay. He said that continuous infusion strategies have long been sought as a method to preserve levodopa efficacy.
BouNDless findings
There were two phases to this multinational trial, called BouNDless. In the first, an open-label run-in phase, 381 patients with Parkinson’s disease were dose-titrated to optimize oral immediate-release levodopa and carbidopa, and were then optimized on the same drugs delivered via ND0612. This run-in phase lasted 12 weeks; 122 patients left the study after it due to adverse events, lack of efficacy, or withdrawal of consent.
In the second phase, the 259 remaining patients were randomized to the continuous infusion arm or to immediate-release oral therapy. In this double-blind, double-dummy phase, those randomized to the ND0612 infusion also received oral placebos, and those randomized to oral therapy received a placebo infusion. Efficacy and safety were assessed at the end of 12 weeks.
At the end of phase 1, the ON time increased by about 3 hours when levodopa-carbidopa dosing was optimized on either delivery method. Dr. Espay attributed the improvement to the value of optimized dosing even in patients with relatively advanced disease.
However, for the purposes of the double-blind comparison, this improvement in ON time provided a new baseline for comparison of the two delivery methods. This is important for interpreting the primary result, which was a 1.72-hour difference in ON time at the end of the study. The difference was created when ON time was maintained with ND0612 continuous drug delivery but eroded in the group randomized to oral immediate-release treatment.
Several secondary endpoints supported the greater efficacy of continuous subcutaneous delivery. These included lower OFF time (0.50 vs. 1.90 hours), less accumulation of disability on the Unified Parkinson’s Disease Rating Scale part II (M-EDL) (-0.30 vs. +2.75 points), and greater improvement on both the Patient Global Impression of Change (+0.31 vs. +0.70 points) and the Clinical Global Impression of Change (+0.31 vs. +0.77 points). The differences were highly statistically significant (all P < .0001).
The patients participating in the double-blind phase of the study were similar in both groups, with a mean age of 63.5 years and a mean time since Parkinson’s disease diagnosis of more than 9 years. The median ON time without troublesome dyskinesias was about 12 hours at baseline in both groups, and the median OFF time was about 3.5 hours.
The higher rate of treatment-related adverse events in the ND0612 group (67.2% vs. 52.7%) was largely explained by the greater rate of infusion site reactions (57.0% vs. 42.7%). The rates of severe reactions in the two groups were the same (0.8%), but both mild (43.8% vs. 36.6%) and moderate (12.5% vs. 5.3%) reactions occurred more commonly in the group receiving active therapy.
“Infusion reactions are the Achilles heel of all subcutaneous therapies,” acknowledged Dr. Espay, who expects other infusion systems in development to share this risk. He suggested that the clinical impact can be attenuated to some degree by rotating infusion sites.
BeyoND extension study
Data from an open-label extension (OLE) of the phase 2b BeyoND trial were also presented at the AAN meeting and generated generally similar results. Largely a safety study, BeyoND had no active control in either the initial trial or the OLE. In BeyoND, the improvement in ON time from baseline was even greater than that seen in BouNDless, but, again, the optimization of dosing in the BouNDless run-in established a greater baseline of disease control.
In the OLE of BeyoND, presented by Aaron Ellenbogen, DO, a neurologist in Farmington, Mich., one of the notable findings was the retention of patients: 82% completed at least 2 years of follow-up, and 66.7% have now remained on treatment for at least 3 years. Dr. Ellenbogen maintains that this retention rate provides compelling evidence of a favorable benefit-to-risk ratio.
Fulfilling an unmet need
The favorable efficacy data from this trial represent “a big advance,” according to Ihtsham Ul Haq, MD, chief, movement disorders division, University of Miami, who was reached for comment. He noted that continuous infusion delivery has been anticipated for some time, and he expects these types of systems to fulfill an unmet need.
“This will be a useful option in a carefully selected group of patients,” said Dr. Haq, who considers the types of improvement in ON time to be highly clinically meaningful.
However, he cautioned that the nodules created by injection site reactions might limit the utility of this treatment option in at least some patients. Wearing the external device might also be a limiting factor for some patients.
In complex Parkinson’s disease, a stage that can be reached fairly rapidly in some patients but might take 15 years or more in others, all of the options involve a careful benefit-to-risk calculation, according to Dr. Haq. Deep brain stimulation is among the most effective options, but continuous infusion might appeal to some patients for delaying this procedure or as an alternative.
“We need multiple options for these types of patients, and it appears that continuous infusion will be one of them,” Dr. Haq said.
Dr. Espay has financial relationships with Acadia, Acorda, Amneal, AskBio, Bexion, Kyowa Kirin, Neuroderm, Neurocrine, and Sunovion. Dr. Ellenbogen has financial relationships with Allergan, Acorda, Supernus, and Teva. Dr. Haq reports no potential conflicts of interest.
FROM AAN 2023
Cycle timing may reduce hormonal dosage for contraception
Progesterone and estrogen are often used for contraception by preventing ovulation, but the adverse effects associated with large doses of these hormones remain a concern, wrote Brenda Lyn A. Gavina, a PhD candidate at the University of the Philippines Diliman, Quezon City, and colleagues.
In a study published in PLoS Computational Biology, the researchers examined how the timing of hormone administration during a cycle might affect the amount of hormones needed for contraception. Previous research has shown that combining hormones can reduce the dosage needed, but the impact of timing on further dose reduction has not been well studied, they said.
The researchers applied optimal control theory in a mathematical model to show the contraceptive effect of estrogen and/or progesterone at different times in the menstrual cycle. The model was based on a normal menstrual cycle with pituitary and ovarian phases. The model assumed that synthesis of luteinizing hormone and follicle-stimulating hormone occurs in the pituitary, that LH and FSH are held in reserve before release into the bloodstream, and that the follicular/luteal mass goes through nine ovarian stages of development. The model also included the activity of ovarian hormones estradiol (E2), progesterone (P4), and inhibin (Inh), in a normal cycle. In the model, LH, FSH, and E2 peaked in the late follicular phase, and P4 and Inh peaked in the luteal phase.
The pituitary model predicted the synthesis, release, and clearance of LH and FSH, and the response of the pituitary to E2, P4, and Inh. The ovarian model predicted the response of E2, P4, and Inh as functions of LH and FSH.
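The structure described above, in which pituitary output drives ovarian hormones that feed back to suppress the pituitary, can be illustrated with a toy simulation. The sketch below is NOT the published model: the equations, parameter values, and starting levels are invented purely to show how such a negative-feedback compartment model behaves.

```python
# Toy sketch of a pituitary-ovarian feedback loop, in the spirit of the
# compartment model described above. Equations and parameters are
# hypothetical and chosen only for illustration.

def simulate(days=28.0, dt=0.01, feedback=1.0):
    """Euler-integrate a minimal two-state system:
    FSH is synthesized in the pituitary and cleared from the blood;
    E2 is produced by the ovary in response to FSH and suppresses
    further FSH release (negative feedback)."""
    fsh, e2 = 10.0, 50.0  # arbitrary starting levels
    for _ in range(int(days / dt)):
        # FSH: constant synthesis, E2-dependent suppression, clearance
        d_fsh = 5.0 / (1.0 + feedback * (e2 / 100.0) ** 2) - 0.5 * fsh
        # E2: FSH-driven production, first-order clearance
        d_e2 = 2.0 * fsh - 0.2 * e2
        fsh += dt * d_fsh
        e2 += dt * d_e2
    return fsh, e2

fsh_fb, e2_fb = simulate(feedback=1.0)  # with ovarian feedback intact
fsh_no, e2_no = simulate(feedback=0.0)  # feedback removed
# With feedback intact, steady-state FSH and E2 settle below the
# feedback-free levels, mimicking ovarian suppression of the pituitary.
```

In the actual study, the analogous (far richer) system of equations is what the optimal control machinery operates on, searching for the exogenous dose schedule that keeps ovulation suppressed at minimum total dose.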
The researchers simulated a constant dose of exogenous progesterone monotherapy and combined exogenous estrogen/progesterone. A P4 peak of 4.99 ng/mL was taken as the target when determining the optimal constant dosage, both for progesterone monotherapy and for combination estrogen/progesterone.
The researchers then assessed the impact of timing on dosage. They found that estrogen administration starting on the first day of a normal cycle prevented FSH from reaching its maximum value; the resulting low FSH level in the follicular phase, together with additional P4 inhibition, slowed follicular growth. Combination estrogen/progesterone caused similar inhibition at a later follicular stage.
“The combination therapy suggests that time-varying doses of estrogen and progesterone given simultaneously from the start to the end of the 28-day period, only requires a surge in estrogen dose around the 12th day of the cycle (a delayed administration compared to the estrogen monotherapy),” they noted.
With attention to timing, the maximum progesterone levels throughout a menstrual cycle were 4.43 ng/mL, 4.66 ng/mL, and 4.31 ng/mL for estrogen monotherapy, progesterone monotherapy, and combination therapy, respectively. Total doses of the optimal exogenous hormone were 77.76 pg/mL and 48.84 ng/mL for estrogen and progesterone monotherapy, respectively, and 35.58 pg/mL and 21.67 ng/mL for estrogen and progesterone in combination.
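The dose sparing implied by those totals can be checked directly. The snippet below is simple arithmetic on the figures reported above (the variable names are ours, and the resulting percentages are our calculation, not values stated in the paper):

```python
# Ratios of combination-therapy hormone totals to monotherapy totals,
# using the optimal-dose figures reported in the article.
estrogen_mono = 77.76       # pg/mL, estrogen monotherapy total
progesterone_mono = 48.84   # ng/mL, progesterone monotherapy total
estrogen_combo = 35.58      # pg/mL, estrogen component of combination
progesterone_combo = 21.67  # ng/mL, progesterone component of combination

e_ratio = estrogen_combo / estrogen_mono          # ~0.46
p_ratio = progesterone_combo / progesterone_mono  # ~0.44

print(f"combination uses {e_ratio:.0%} of the estrogen monotherapy total")
print(f"combination uses {p_ratio:.0%} of the progesterone monotherapy total")
```

In other words, pairing the two hormones cut each component to a little under half of its monotherapy total in this model.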
The findings were limited by the use of a standard model that does not account for variations in cycle length, the researchers noted. However, the results reflect other studies of hormonal activity, and the model can be used in future studies of the effect of hormones on cycle length, they said.
Overall, the researchers determined that timing dosage with estrogen monotherapy based on their model could provide effective contraception with about 92% of the minimum total constant dosage, while progesterone monotherapy would be effective with approximately 43% of the total constant dose.
Although more work is needed, the current study results may guide clinicians in experimenting with the optimal treatment regimen for anovulation, the researchers said.
“The results presented here give insights on construction of timed devices that give contraception at certain parts of the menstrual cycle,” they concluded.
Model aims to improve women’s control of contraception
“Aside from wanting to contribute to controlling population growth, we aim to empower women more by giving them more control on when to conceive and start motherhood,” and be in control of contraception in a safer way, said lead author Ms. Gavina, in an interview. In addition, studies are showing the noncontraceptive benefits of suppressing ovulation for managing premenstrual syndromes such as breast tenderness and irritability, and for managing diseases such as endometriosis, she said. “Anovulation also lowers the risk of ACL injuries in female athletes,” she added.
Ms. Gavina said that she was surprised primarily by the optimal control result for estrogen monotherapy. “It was surprising that, theoretically, our mathematical model, with the simplifying assumptions, showed that as low as 10% of the total dose in constant administration could achieve contraception as long as the administration of this dosage is perfectly timed, and the timing was also shown in our optimization result,” she said.
“Our model does not capture all factors in contraception, since the reproductive function in women is a very complex multiscale dynamical system highly dependent on both endogenous and exogenous hormones,” Ms. Gavina told this news organization. However, “with the emergence of more data, it can be refined to address other contraception issues. Further, although the results of this study are not directly translatable to the clinical setting, we hope that these results may aid clinicians in identifying the minimum dose and treatment schedule for contraception,” she said.
Future research directions include examining within and between women’s variabilities and adding a pharmacokinetics model to account for the effects of specific drugs, she said. “We also hope to expand or modify the current model to investigate reproductive health concerns in women, such as [polycystic ovary syndrome] and ovarian cysts,” she added.
Ms. Gavina disclosed support from the University of the Philippines Office of International Linkages, a Continuous Operational and Outcomes-based Partnership for Excellence in Research and Academic Training Enhancement grant, and a Commission on Higher Education Faculty Development Program-II scholarship.
Progesterone and estrogen are often used for contraception by preventing ovulation, but the adverse effects associated with large doses of these hormones remain a concern, wrote Brenda Lyn A. Gavina, a PhD candidate at the University of the Philippines Diliman, Quezon City, and colleagues.
In a study published in PLoS Computational Biology, the researchers examined how the timing of hormone administration during a cycle might impact the amount of hormones needed for contraception. Previous research shown that combining hormones can reduce the dosage needed, but the impact of timing on further dose reduction has not been well studied, they said.
The researchers applied optimal control theory in a mathematical model to show the contraceptive effect of estrogen and/or progesterone at different times in the menstrual cycle. The model was based on a normal menstrual cycle with pituitary and ovarian phases. The model assumed that synthesis of luteinizing hormone and follicle-stimulating hormone occurs in the pituitary, that LH and FSH are held in reserve before release into the bloodstream, and that the follicular/luteal mass goes through nine ovarian stages of development. The model also included the activity of ovarian hormones estradiol (E2), progesterone (P4), and inhibin (Inh), in a normal cycle. In the model, LH, FSH, and E2 peaked in the late follicular phase, and P4 and Inh peaked in the luteal phase.
The pituitary model predicted the synthesis, release, and clearance of LH and FSH, and the response of the pituitary to E2, P4, and Inh. The ovarian model predicted the response of E2, P4, and Inh as functions of LH and FSH.
The researchers simulated a constant dose of exogenous progesterone monotherapy and combined exogenous estrogen/progesterone. They determined that a P4 peak of 4.99 ng/mL was taken as the optimum constant dosage for progesterone monotherapy, and for combination estrogen/progesterone.
The researchers then assessed the impact of time on dosage. They found that estrogen administration starting on the first day of a normal cycle preventing FHS from reaching maximum value, and that the low level of FHS in the follicular phase and additional P4 inhibition slowed follicular growth, and use of combination estrogen/progesterone caused similar inhibition at a later follicular stage.
“The combination therapy suggests that time-varying doses of estrogen and progesterone given simultaneously from the start to the end of the 28-day period, only requires a surge in estrogen dose around the 12th day of the cycle (a delayed administration compared to the estrogen monotherapy),” they noted.
With attention to timing, the maximum progesterone levels throughout a menstrual cycle were 4.43 ng/mL, 4.66 ng/mL, and 4.31 ng/mL for estrogen monotherapy, progesterone monotherapy, and combination therapy, respectively. Total doses of the optimal exogenous hormone were 77.76 pg/mL and 48.84 ng/mL for estrogen and progesterone monotherapy, respectively, and 35.58 pg/mL and 21.67 ng/mL for estrogen and progesterone in combination.
The findings were limited by the use of a standard model that does not account for variations in cycle length, the researchers noted. However, the results reflect other studies of hormonal activity, and the model can be used in future studies of the effect of hormones on cycle length, they said.
Overall, the researchers determined that timing dosage with estrogen monotherapy based on their model could provide effective contraception with about 92% of the minimum total constant dosage, while progesterone monotherapy would be effective with approximately 43% of the total constant dose.
Although more work is needed, the current study results may guide clinicians in experimenting with the optimal treatment regimen for anovulation, the researchers said.
“The results presented here give insights on construction of timed devices that give contraception at certain parts of the menstrual cycle,” they concluded.
Model aims to improve women’s control of contraception
“Aside from wanting to contribute to controlling population growth, we aim to empower women more by giving them more control on when to conceive and start motherhood,” and be in control of contraception in a safer way, said lead author Ms. Gavina, in an interview. In addition, studies are showing the noncontraceptive benefits of suppressing ovulation for managing premenstrual syndromes such as breast tenderness and irritability, and for managing diseases such as endometriosis, she said. “Anovulation also lowers the risk of ACL injuries in female athletes,” she added.
Ms. Gavina said that she was surprised primarily by the optimal control result for estrogen monotherapy. “It was surprising that, theoretically, our mathematical model, with the simplifying assumptions, showed that as low as 10% of the total dose in constant administration could achieve contraception as long as the administration of this dosage is perfectly timed, and the timing was also shown in our optimization result,” she said.
“Our model does not capture all factors in contraception, since the reproductive function in women is a very complex multiscale dynamical system highly dependent on both endogenous and exogenous hormones,” Ms. Gavina told this news organization. However, “with the emergence of more data, it can be refined to address other contraception issues. Further, although the results of this study are not directly translatable to the clinical setting, we hope that these results may aid clinicians in identifying the minimum dose and treatment schedule for contraception,” she said.
Future research directions include examining within and between women’s variabilities and adding a pharmacokinetics model to account for the effects of specific drugs, she said. “We also hope to expand or modify the current model to investigate reproductive health concerns in women, such as [polycystic ovary syndrome] and ovarian cysts,” she added.
Ms. Gavina disclosed support from the University of the Philippines Office of International Linkages, a Continuous Operational and Outcomes-based Partnership for Excellence in Research and Academic Training Enhancement grant, and a Commission on Higher Education Faculty Development Program-II scholarship.
Progesterone and estrogen are often used for contraception by preventing ovulation, but the adverse effects associated with large doses of these hormones remain a concern, wrote Brenda Lyn A. Gavina, a PhD candidate at the University of the Philippines Diliman, Quezon City, and colleagues.
In a study published in PLoS Computational Biology, the researchers examined how the timing of hormone administration during a cycle might impact the amount of hormones needed for contraception. Previous research shown that combining hormones can reduce the dosage needed, but the impact of timing on further dose reduction has not been well studied, they said.
The researchers applied optimal control theory in a mathematical model to show the contraceptive effect of estrogen and/or progesterone at different times in the menstrual cycle. The model was based on a normal menstrual cycle with pituitary and ovarian phases. The model assumed that synthesis of luteinizing hormone and follicle-stimulating hormone occurs in the pituitary, that LH and FSH are held in reserve before release into the bloodstream, and that the follicular/luteal mass goes through nine ovarian stages of development. The model also included the activity of ovarian hormones estradiol (E2), progesterone (P4), and inhibin (Inh), in a normal cycle. In the model, LH, FSH, and E2 peaked in the late follicular phase, and P4 and Inh peaked in the luteal phase.
The pituitary model predicted the synthesis, release, and clearance of LH and FSH, and the response of the pituitary to E2, P4, and Inh. The ovarian model predicted the response of E2, P4, and Inh as functions of LH and FSH.
The researchers simulated a constant dose of exogenous progesterone monotherapy and combined exogenous estrogen/progesterone. They determined that a P4 peak of 4.99 ng/mL represented the optimal constant dosage, both for progesterone monotherapy and for combination estrogen/progesterone.
The researchers then assessed the impact of timing on dosage. They found that estrogen administration starting on the first day of a normal cycle prevented FSH from reaching its maximum value, and that the low level of FSH in the follicular phase, together with additional P4 inhibition, slowed follicular growth; combination estrogen/progesterone caused similar inhibition at a later follicular stage.
“The combination therapy suggests that time-varying doses of estrogen and progesterone given simultaneously from the start to the end of the 28-day period, only requires a surge in estrogen dose around the 12th day of the cycle (a delayed administration compared to the estrogen monotherapy),” they noted.
With attention to timing, the maximum progesterone levels throughout a menstrual cycle were 4.43 ng/mL, 4.66 ng/mL, and 4.31 ng/mL for estrogen monotherapy, progesterone monotherapy, and combination therapy, respectively. Total doses of the optimal exogenous hormone were 77.76 pg/mL and 48.84 ng/mL for estrogen and progesterone monotherapy, respectively, and 35.58 pg/mL and 21.67 ng/mL for estrogen and progesterone in combination.
The findings were limited by the use of a standard model that does not account for variations in cycle length, the researchers noted. However, the results reflect other studies of hormonal activity, and the model can be used in future studies of the effect of hormones on cycle length, they said.
Overall, the researchers determined that timing dosage with estrogen monotherapy based on their model could provide effective contraception with about 92% of the minimum total constant dosage, while progesterone monotherapy would be effective with approximately 43% of the total constant dose.
Although more work is needed, the current study results may guide clinicians in experimenting with the optimal treatment regimen for anovulation, the researchers said.
“The results presented here give insights on construction of timed devices that give contraception at certain parts of the menstrual cycle,” they concluded.
Model aims to improve women’s control of contraception
“Aside from wanting to contribute to controlling population growth, we aim to empower women more by giving them more control on when to conceive and start motherhood,” and to control contraception in a safer way, said lead author Ms. Gavina in an interview. In addition, studies are showing the noncontraceptive benefits of suppressing ovulation for managing premenstrual symptoms such as breast tenderness and irritability, and for managing diseases such as endometriosis, she said. “Anovulation also lowers the risk of ACL injuries in female athletes,” she added.
Ms. Gavina said that she was surprised primarily by the optimal control result for estrogen monotherapy. “It was surprising that, theoretically, our mathematical model, with the simplifying assumptions, showed that as low as 10% of the total dose in constant administration could achieve contraception as long as the administration of this dosage is perfectly timed, and the timing was also shown in our optimization result,” she said.
“Our model does not capture all factors in contraception, since the reproductive function in women is a very complex multiscale dynamical system highly dependent on both endogenous and exogenous hormones,” Ms. Gavina told this news organization. However, “with the emergence of more data, it can be refined to address other contraception issues. Further, although the results of this study are not directly translatable to the clinical setting, we hope that these results may aid clinicians in identifying the minimum dose and treatment schedule for contraception,” she said.
Future research directions include examining within and between women’s variabilities and adding a pharmacokinetics model to account for the effects of specific drugs, she said. “We also hope to expand or modify the current model to investigate reproductive health concerns in women, such as [polycystic ovary syndrome] and ovarian cysts,” she added.
Ms. Gavina disclosed support from the University of the Philippines Office of International Linkages, a Continuous Operational and Outcomes-based Partnership for Excellence in Research and Academic Training Enhancement grant, and a Commission on Higher Education Faculty Development Program-II scholarship.
FROM PLOS COMPUTATIONAL BIOLOGY
Child’s health improves by applying new obesity guidelines
At age 15 years, Maya was referred by her primary care provider to our pediatric obesity center. She weighed 151 kg and had a body mass index (BMI) over 48 kg/m2. One year earlier, she had been diagnosed with hypertension and prediabetes.
A review of her growth charts showed that she had been in the 95th percentile at age 8 years. Her weight had steadily risen, with a sharp increase of 55 lb between 2020 and 2022, during the COVID-19 pandemic. Her primary care provider monitored her from age 8 to 12 years, providing nutrition and physical activity counseling.
In February, the American Academy of Pediatrics released new clinical practice guidelines for managing childhood obesity. A better understanding of the pathophysiology has challenged the worn-out concept of lack of willpower and personal responsibility as the cause of obesity, which has been the basis for weight-related bias and stigma. The updated guidelines have also been influenced by lifestyle intervention studies, the US Food and Drug Administration approval of new anti-obesity medications, and the 2013 designation of obesity as a disease by the American Medical Association.
We used these updated guidelines in our approach to treating Maya.
Starting with the assessment
In the new AAP guidelines, assessing the genetic, environmental, and social-determinant risks for obesity forms the basis for evaluation and intervention. Following this approach, we conducted a complete medical evaluation of Maya, including a review of her symptoms and family history, along with a physical examination to assess for comorbidities and other causes of obesity (for example, genetic disorders, hypothyroidism).
We also collected information regarding her diet and behaviors (for example, drinking sweet beverages, fruit and vegetable intake, parent feeding style, portion sizes, emotional eating, hyperphagia), physical activity behaviors (for example, physical education, organized sports), screen time, social drivers of health (for example, food insecurity, neighborhood, school environment), family and household factors (for example, family composition, support, number of caregivers, parenting style) and mental and physical health (autism, attention-deficit/hyperactivity disorder, history of being bullied, developmental and physical disabilities). Because Maya had a BMI of 48, she met the criterion for severe obesity, which is having a BMI at least 120% of the 95th percentile.
The guidelines use BMI as a criterion for screening for obesity because it is inexpensive and easy to obtain in the clinic setting. The Centers for Disease Control and Prevention growth chart uses BMI as well. Recently, there has been controversy about solely using BMI to define obesity, which is a point that the guidelines address by emphasizing evaluation of the whole child along with BMI to make a diagnosis of obesity.
The child’s age and the severity of their obesity drive the evaluation for comorbidities and treatment. In children and adolescents aged 10 years or older with obesity (BMI ≥ 95th percentile), pediatricians and other primary care providers should evaluate for lipid abnormalities, abnormal glucose metabolism, and abnormal liver function.
Maya presented with snoring, early-morning headaches, daytime sleepiness, and abdominal pain. A sleep study revealed an apnea-hypopnea index of 15, indicating obstructive sleep apnea, and she was placed on a continuous positive airway pressure machine.
Her laboratory studies showed elevated triglycerides of 169 mg/dL and abnormal ALT (123 IU/L). Potential causes of elevated liver function test results (such as abnormal ceruloplasmin levels or infectious or autoimmune hepatitis) were excluded, and a liver ultrasound with elastography indicated steatohepatitis. Maya was referred to gastroenterology for nonalcoholic fatty liver disease.
Maya experienced depressive symptoms, including difficulty with peer relationships and declining academic performance. Her Patient Health Questionnaire–9 score was 21, with a moderate impact on her daily functioning. Prior attempts at counseling had been sporadic and not helpful. She was diagnosed with intermittent moderate clinical depression, started on a selective serotonin reuptake inhibitor, and resumed counseling with a new therapist.
Considering treatment options
Based on shared decision-making, our team began a more intensive lifestyle behavior treatment as recommended in the updated guidelines. Maya chose to decrease sugar-sweetened beverages as her initial nutrition goal, a change that can lead to a reduction of liver function test results and triglycerides, even in the absence of weight loss.
As emphasized in the guidelines, we stressed the importance of managing obesity and comorbidities concurrently to the family. In addition to lifestyle behavior intervention, once her mental health stabilized, Maya and her mother opted for bariatric surgery. Sleeve gastrectomy was elected because she met the criteria.
If the child already has obesity, the guidelines discourage watchful waiting (that is, the expectation that the child will grow into their weight) as Maya’s primary care provider had done when she was younger. The staged treatment approach where progressively more intensive interventions are adopted (a hallmark of the 2007 guidelines) is no longer recommended. Rather, the primary care provider should offer treatment options guided by age, severity of obesity, and comorbidities.
Maya completed a bariatric preoperative program and an extensive mental health evaluation, and she tolerated the sleeve gastrectomy well with no complications. At her 6-month postoperative visit, she had lost 99 lb (45 kg) since the surgery, with an 18% decline in BMI. She is taking daily multivitamins as well as calcium and vitamin D. She continues to incorporate healthy eating into her life, with a focus on adequate protein intake, and is exercising three to four times per week in the apartment complex gym. She reports better physical and mental health, her school performance has improved, and she still receives regular counseling.
Maya’s story outlines the benefits of early and intensive intervention as recommended by the new AAP guidelines. The shift from some of the earlier recommendations is partly driven by the persistence of childhood obesity into adulthood, especially for older children with serious psychosocial and physical comorbidities. Hopefully, by implementing the new guidelines, physicians can provide empathetic, bias-free, and effective care that recognizes the needs and environment of the whole child.
Dr. Salhah is a pediatric endocrinology fellow at Nationwide Children’s Hospital, Columbus, Ohio. Dr. Eneli is director of the Center for Healthy Weight and Nutrition at Nationwide Children’s Hospital. Dr. Salhah reported no conflicts of interest. Dr. Eneli reported receiving research grants and income from the National Institutes of Health, the AAP, and the National Academy of Medicine.
A version of this article first appeared on Medscape.com.
Predicting BPD vs. bipolar treatment response: New imaging data
A new study identifies specific brain regions involved in treatment response in bipolar disorder (BD) and borderline personality disorder (BPD), potentially paving the way for more targeted treatment.
In a meta-analysis of 34 studies that used neuroimaging to investigate changes in brain activation following psychotherapy and pharmacotherapy for BD and BPD, investigators found most brain regions showing abnormal activation in both conditions improved after treatment. In particular, changes in brain activity after psychotherapy were found primarily in the frontal areas, whereas pharmacotherapy largely altered the limbic areas.
“…,” senior investigator Xiaoming Li, PhD, professor in the department of medical psychology, Anhui Medical University, Hefei, China, told this news organization.
“It may also contribute to the identification of more accurate neuroimaging biomarkers for treatment of the two disorders and to the finding of more effective therapy,” Dr. Li said.
The study was published online in the Journal of Clinical Psychiatry.
Blurred boundary
Dr. Li called BD and BPD “difficult to diagnose and differentiate,” noting that the comorbidity rate is “very high.” Underestimating the boundary between BD and BPD “increases the risk of improper or harmful drug exposure,” since mood-stabilizing drugs are “considered to be the key therapeutic intervention for BD, while psychotherapy is the key treatment for BPD.”
The “blurred boundary between BD and BPD is one of the reasons it is important to study the relationship between these two diseases,” the authors said.
Previous studies comparing the relationship between BD and BPD “did not explore the similarities and differences in brain mechanisms between these two disorders after treatment,” they pointed out.
Patients with BD have a different disease course and response to therapy compared with patients with BPD. “Misdiagnosis may result in the patients receiving ineffective treatment, so it is particularly important to explore the neural mechanisms of the treatment of these two diseases,” Dr. Li said.
To investigate, the researchers used activation likelihood estimation (ALE) – a technique that examines coordinates of neuroimaging data gleaned from published studies – after searching several databases from inception until June 2021.
This approach was used to “evaluate the similarities and differences in the activation of different brain regions in patients with BD and BPD after treatment with psychotherapy and drug therapy.”
Studies were required to include patients with a clinical diagnosis of BD or BPD; neuroimaging with functional MRI; coordinates of the peak activations in the stereotactic space of the Montreal Neurological Institute or Talairach; treatment (pharmacologic or psychological) for patients with BD or BPD; and results of changes in brain activation after treatment, relative to a before-treatment condition.
Of 1,592 records, 34 studies (n = 912 subjects) met inclusion criteria and were selected and used in extracting the activation coordinates. The researchers extracted a total of 186 activity increase points and 90 activity decrease points. After combining these calculations, they found 12 increased activation clusters and 2 decreased activation clusters.
Of the studies, 23 focused on BD and 11 on BPD; 14 used psychotherapy, 18 used drug therapy, and 2 used a combination of both approaches.
Normalizing activation levels
Both treatments were associated with convergent activity increases and decreases in several brain regions: the anterior cingulate cortex, medial frontal gyrus, inferior frontal gyrus, cingulate gyrus, parahippocampal gyrus, and the posterior cingulate cortex.
The researchers then examined studies based on treatment method (psychotherapy or pharmacotherapy) and its effect on the two disorders.
“After psychotherapy, the frontal lobe and temporal lobe were the primary brain regions in which activation changed, indicating a top-down effect of this therapy type, while after drug therapy, the limbic area was the region in which activation changed, indicating a ‘bottom-up’ effect,” said Dr. Li.
Dr. Li cited previous research pointing to functional and structural abnormalities in both disorders – especially in the default mode network (DMN) and frontolimbic network.
In particular, alterations in the amygdala and the parahippocampal gyrus are reported more frequently in BPD than in BD, whereas dysfunctional frontolimbic brain regions seem to underlie the emotional dysfunction in BPD. Several studies have also associated the impulsivity of BD with dysfunctions in the interplay of cortical-limbic circuits.
Dr. Li said the study findings suggest “that treatment may change these brain activation levels by acting on the abnormal brain circuit, such as the DMN and the frontolimbic network so as to ‘normalize’ its activity and improve symptoms.”
Specifically, brain regions with abnormally increased activation “showed decreased activation after treatment, and brain regions with abnormally decreased activation showed increased activation after treatment.”
Discrete, overlapping mechanisms
Commenting on the study, Roger S. McIntyre, MD, professor of psychiatry and pharmacology, University of Toronto, and head of the Mood Disorders Psychopharmacology Unit, said the study “provides additional support for the underlying neurobiological signature of bipolar disorder and a commonly encountered co-occurring condition – borderline personality disorder – having both discrete yet overlapping mechanisms.”
He found it interesting that “medications have a different principal target than psychosocial interventions, which has both academic and clinical implications.
“The academic implication is that we have reasons to believe that we will be in a position to parse the neurobiology of bipolar disorder or borderline personality disorder when we take an approach that isolates specific domains of psychopathology, which is what they [the authors] appear to be doing,” said Dr. McIntyre, who wasn’t associated with this research.
In addition, “from the clinical perspective, this provides a rationale for why we should be integrating pharmacotherapy with psychotherapy in people who have comorbid conditions like borderline personality disorder, which affects 20% of people living with bipolar disorder and 60% to 70% have borderline traits,” he added.
The research was supported by the Anhui Natural Science Foundation and Grants for Scientific Research from Anhui Medical University. Dr. Li and coauthors declared no relevant financial relationships. Dr. McIntyre has received research grant support from CIHR/GACD/National Natural Science Foundation of China and the Milken Institute; speaker/consultation fees from Lundbeck, Janssen, Alkermes, Neumora Therapeutics, Boehringer Ingelheim, Sage, Biogen, Mitsubishi Tanabe, Purdue, Pfizer, Otsuka, Takeda, Neurocrine, Sunovion, Bausch Health, Axsome, Novo Nordisk, Kris, Sanofi, Eisai, Intra-Cellular, NewBridge Pharmaceuticals, Viatris, AbbVie, Atai Life Sciences. Dr. McIntyre is a CEO of Braxia Scientific Corp.
A version of this article first appeared on Medscape.com.
A new study identifies specific brain regions involved in treatment response in bipolar disorder (BD) and borderline personality disorder (BPD), potentially paving the way for more targeted treatment.
In a meta-analysis of 34 studies that used neuroimaging to investigate changes in brain activation following psychotherapy and pharmacotherapy for BD and BPD, investigators found most brain regions showing abnormal activation in both conditions improved after treatment. In particular, changes in brain activity after psychotherapy were found primarily in the frontal areas, whereas pharmacotherapy largely altered the limbic areas.
“It may also contribute to the identification of more accurate neuroimaging biomarkers for treatment of the two disorders and to the finding of more effective therapy,” senior investigator Xiaoming Li, PhD, professor, department of medical psychology, Anhui Medical University, Hefei, China, told this news organization.
The study was published online in the Journal of Clinical Psychiatry.
Blurred boundary
Dr. Li called BD and BPD “difficult to diagnose and differentiate,” noting that the comorbidity rate is “very high.” Underestimating the boundary between BD and BPD “increases the risk of improper or harmful drug exposure,” since mood-stabilizing drugs are “considered to be the key therapeutic intervention for BD, while psychotherapy is the key treatment for BPD.”
The “blurred boundary between BD and BPD is one of the reasons it is important to study the relationship between these two diseases,” the authors said.
Previous studies comparing the relationship between BD and BPD “did not explore the similarities and differences in brain mechanisms between these two disorders after treatment,” they pointed out.
Patients with BD have a different disease course and response to therapy, compared with patients with BPD. “Misdiagnosis may result in the patients receiving ineffective treatment, so it is particularly important to explore the neural mechanisms of the treatment of these two diseases,” Dr. Li said.
To investigate, the researchers used activation likelihood estimation (ALE) – a technique that examines coordinates of neuroimaging data gleaned from published studies – after searching several databases from inception until June 2021.
This approach was used to “evaluate the similarities and differences in the activation of different brain regions in patients with BD and BPD after treatment with psychotherapy and drug therapy.”
Studies were required to focus on patients with a clinical diagnosis of BD or BPD; neuroimaging studies using functional MRI; coordinates of the peak activations in the stereotactic space of the Montreal Neurologic Institute or Talairach; treatment (pharmacologic or psychological) for patients with BD or BPD; and results of changes in brain activation after treatment, relative to a before-treatment condition.
Of 1,592 records, 34 studies (n = 912 subjects) met inclusion criteria and were selected and used in extracting the activation coordinates. The researchers extracted a total of 186 activity increase points and 90 activity decrease points. After combining these calculations, they found 12 increased activation clusters and 2 decreased activation clusters.
Of the studies, 23 focused on BD and 11 on BPD; 14 used psychotherapy, 18 used drug therapy, and 2 used a combination of both approaches.
Normalizing activation levels
Both treatments were associated with convergent activity increases and decreases in several brain regions: the anterior cingulate cortex, medial frontal gyrus, inferior frontal gyrus, cingulate gyrus, parahippocampal gyrus, and the posterior cingulate cortex.
The researchers then examined studies based on treatment method – psychotherapy or pharmacotherapy and the effect on the two disorders.
“After psychotherapy, the frontal lobe and temporal lobe were the primary brain regions in which activation changed, indicating a top-down effect of this therapy type, while after drug therapy, the limbic area was the region in which activation changed, indicating a ‘bottom-up’ effect,” said Dr. Li.
Dr. Li cited previous research pointing to functional and structural abnormalities in both disorders – especially in the default mode network (DMN) and frontolimbic network.
In particular, alterations in the amygdala and the parahippocampal gyrus are reported more frequently in BPD than in BD, whereas dysfunctional frontolimbic brain regions seem to underlie the emotional dysfunction in BPD. Several studies have also associated the impulsivity of BD with dysfunctions in the interplay of cortical-limbic circuits.
Dr. Li said the study findings suggest “that treatment may change these brain activation levels by acting on the abnormal brain circuit, such as the DMN and the frontolimbic network so as to ‘normalize’ its activity and improve symptoms.”
Specifically, brain regions with abnormally increased activation “showed decreased activation after treatment, and brain regions with abnormally decreased activation showed increased activation after treatment.”
Discrete, overlapping mechanisms
Commenting on the study, Roger S. McIntyre, MD, professor of psychiatry and pharmacology, University of Toronto, and head of the Mood Disorders Psychopharmacology Unit, said the study “provides additional support for the underlying neurobiological signature of bipolar disorder and a commonly encountered co-occurring condition – borderline personality disorder – having both discrete yet overlapping mechanisms.”
He found it interesting that “medications have a different principal target than psychosocial interventions, which has both academic and clinical implications.
“The academic implication is that we have reasons to believe that we will be in a position to parse the neurobiology of bipolar disorder or borderline personality disorder when we take an approach that isolates specific domains of psychopathology, which is what they [the authors] appear to be doing,” said Dr. McIntyre, who wasn’t associated with this research.
In addition, “from the clinical perspective, this provides a rationale for why we should be integrating pharmacotherapy with psychotherapy in people who have comorbid conditions like borderline personality disorder, which affects 20% of people living with bipolar disorder and 60% to 70% have borderline traits,” he added.
The research was supported by the Anhui Natural Science Foundation and Grants for Scientific Research from Anhui Medical University. Dr. Li and coauthors declared no relevant financial relationships. Dr. McIntyre has received research grant support from CIHR/GACD/National Natural Science Foundation of China and the Milken Institute; speaker/consultation fees from Lundbeck, Janssen, Alkermes, Neumora Therapeutics, Boehringer Ingelheim, Sage, Biogen, Mitsubishi Tanabe, Purdue, Pfizer, Otsuka, Takeda, Neurocrine, Sunovion, Bausch Health, Axsome, Novo Nordisk, Kris, Sanofi, Eisai, Intra-Cellular, NewBridge Pharmaceuticals, Viatris, AbbVie, Atai Life Sciences. Dr. McIntyre is a CEO of Braxia Scientific Corp.
A version of this article first appeared on Medscape.com.
FROM THE JOURNAL OF CLINICAL PSYCHIATRY
What new cardiovascular disease risk factors have emerged?
Cardiovascular disease (CVD) is the main cause of premature death and disability in the general population, and according to the World Health Organization, the incidence of CVD is increasing throughout the world. Conventional risk factors that contribute to the occurrence and worsening of CVD have been identified and widely studied. They include high cholesterol levels, high blood pressure, diabetes, obesity, smoking, and lack of physical activity. Despite the introduction of measures to prevent and treat these risk factors with lipid-lowering drugs, antihypertensives, antiplatelet drugs, and anticoagulants, the mortality rate related to CVD remains high.
Despite the effectiveness of many currently available treatment options, there are still significant gaps in risk assessment and treatment of CVD.
These emerging risk factors are detailed in an editorial published in The American Journal of Medicine, which describes their role and their impact on cardiovascular health.
Systemic inflammation
The new coronary risk factors include the following diseases characterized by systemic inflammation:
- Gout – Among patients who have experienced a recent flare of gout, the probability of experiencing an acute cardiovascular event such as a myocardial infarction or stroke is increased.
- Rheumatoid arthritis and systemic lupus erythematosus – Patients with one or both of these conditions are at higher odds of experiencing premature and extremely premature coronary artery disease.
- Inflammatory bowel disease (Crohn’s disease or ulcerative colitis) – Patients with this disease have increased odds of developing coronary artery disease.
- Psoriasis – Patients with psoriasis are up to 50% more likely to develop CVD.
Maternal and childhood factors
The following maternal and childhood factors are associated with an increased risk of developing coronary artery disease: gestational diabetes; preeclampsia; delivering a child of low birth weight; preterm delivery; and premature or surgical menopause. The factor or factors that increase the risk of coronary artery disease associated with each of these conditions are not known but may be the result of increased cytokine levels and oxidative stress.
An unusual and yet unexplained association has been observed between migraine headaches with aura in women and incident CVD.
Also of interest is the association of early life trauma and the risk of adverse cardiovascular outcomes in young and middle-aged individuals who have a history of myocardial infarction.
Transgender patients who present for gender-affirming care are also at increased cardiovascular risk. Among these patients, the increase in coronary artery disease risk may be related to high rates of anxiety and depression.
Environmental factors
Low socioeconomic status has emerged as a risk factor. Increased psychosocial stressors, limited educational and economic opportunities, and lack of peer influence favoring healthier lifestyle choices may be causative elements leading to higher rates of coronary artery disease among individuals with low socioeconomic living conditions.
Air pollution was estimated to have caused 9 million deaths worldwide in 2019, with 62% due to CVD and 31.7% to coronary artery disease. Severely polluted environmental aerosols contain several toxic metals, such as lead, mercury, arsenic, and cadmium. Transient exposure to various air pollutants may trigger the onset of an acute coronary syndrome.
Lifestyle factors
Long working hours by patients who have experienced a first myocardial infarction increase the risk for a recurrent event, possibly because of prolonged exposure to work stressors.
Skipping breakfast has been linked to increased cardiovascular and all-cause mortality.
Long-term consumption of drinks containing sugar and artificial sweeteners has also been associated with increased cardiovascular mortality.
Recognizing the presence of one or more of these new risk factors could prompt behaviors that reduce the more conventional CV risk factors to a minimum.
This article was translated from Univadis Italy, which is part of the Medscape Professional Network.
A version of this article first appeared on Medscape.com.
FROM THE AMERICAN JOURNAL OF MEDICINE
Registry data ‘reassure’ on biologics’ heart attack risk in rheumatoid arthritis
MANCHESTER, ENGLAND – Rheumatoid arthritis patients are no more likely to have a heart attack if they are treated with an interleukin-6 inhibitor (IL-6i) than if they are treated with a tumor necrosis factor inhibitor (TNFi), according to data presented at the British Society for Rheumatology annual meeting.
Results of a large analysis from the long-running British Society for Rheumatology Biologics Register–Rheumatoid Arthritis (BSRBR-RA) found no statistically significant difference in the rate of myocardial infarction (MI) between the two treatments in almost 21,000 patients. The overall propensity score–adjusted hazard ratio for MI risk comparing TNFi with IL-6i was 0.77, but the 95% confidence interval crossed the threshold for statistical significance.
“This result reassures patients and clinical teams about the long-term treatment effects on myocardial infarction in a real-world setting,” said Tian Zixing, a PhD student at the University of Manchester (England).
“Patients with rheumatoid arthritis have an increased risk of myocardial infarction, compared to the general population,” Ms. Tian explained. However, this risk has been “considerably improved” with biologic treatment of rheumatoid arthritis, notably with TNFi drugs compared with nonbiologic disease-modifying antirheumatic drugs.
The reasoning behind the current analysis was to see if there was any risk associated with IL-6i, as these drugs have been noted to increase low-density lipoprotein (LDL) cholesterol levels, which in turn can raise the risk for MI.
The study population consisted of all patients registered in the BSRBR-RA over the past 20 years who had started treatment with one of the many TNFi drugs available in the U.K. – adalimumab (Humira and biosimilars), etanercept (Enbrel), infliximab (Remicade and biosimilars), certolizumab pegol (Cimzia), and golimumab (Simponi) – or one of the two available drugs that block IL-6 signaling – tocilizumab (RoActemra, but Actemra in the U.S.) and sarilumab (Kevzara), both of which target the IL-6 receptor.
Clinical follow-up forms, death certificates, and patient reports confirmed by the clinical team were used to identify patients who experienced an MI, but only MIs that occurred while on treatment were counted.
More than 30,000 lines of therapy in 20,898 patients were recorded. Ms. Tian noted that most (> 90%) patients had been treated with a TNFi across all lines of therapy.
“It is very important to consider the treatment sequence,” she said. “Most patients start first-line treatment with a TNF inhibitor, with only a few patients starting an IL-6 inhibitor,” she noted. “IL-6 inhibitors are more commonly used in the later stages of disease, when more cardiovascular risk factors have accumulated.”
Thus, to ensure that the MI risk was fairly evaluated, the statistical analyses compared TNFi and IL-6i according to the line of treatment. “That means only patients on their first-line treatment will be compared to each other, and only those on their second-line treatment will be compared to each other, and so on,” Ms. Tian explained.
Baseline characteristics were broadly similar for patients treated with TNFi and IL-6i drugs, except for hyperlipidemia, which was higher in patients treated with an IL-6i. Nevertheless, there was no suggestion of any difference in the MI rates after adjustment for cardiovascular risk factors.
There are a lot of strengths to these data, but of course the possibilities of residual confounding and confounding by indication exist, Ms. Tian said. There were also missing data that had to be imputed.
“There has been quite a bit around interleukin-1 blockers being cardiovascular protective,” observed Kenneth Baker, MBChB, PhD, who chaired the RA oral abstracts session during which Ms. Tian presented the findings.
“IL-6 is quite good at suppressing CRP [C-reactive protein],” added Dr. Baker, a senior clinical research fellow at Newcastle University and honorary consultant rheumatologist at Freeman Hospital, both in Newcastle upon Tyne, England.
“You’ve hypothesized or extrapolated that the differences in the lipid levels may not be relevant,” he said to Ms. Tian, “but do you think there might be an extra element going on here?” Maybe IL-6i drugs such as tocilizumab are better at suppressing inflammation, and that counterbalances the effects on lipids, he suggested.
Ms. Tian and Dr. Baker disclosed no relevant financial relationships. The BSRBR-RA is managed by the University of Manchester on behalf of the British Society for Rheumatology. The registry is supported by funding from multiple pharmaceutical companies, including AbbVie, Amgen, Celltrion Healthcare, Eli Lilly, Galapagos, Pfizer, Samsung Bioepis, and Sanofi, and in the past Hospira, Merck Sharp & Dohme, Roche, Sandoz, SOBI, and UCB.
A version of this article originally appeared on Medscape.com.
MANCHESTER, ENGLAND – Rheumatoid arthritis patients are no more likely to have a heart attack if they are treated with an interleukin-6 inhibitor (IL-6i) than if they are treated with a tumor necrosis factor inhibitor (TNFi), according to data presented at the British Society for Rheumatology annual meeting.
Results of a large analysis from the long-running British Society for Rheumatology Biologics Register–Rheumatoid Arthritis (BSRBR-RA), covering almost 21,000 patients, found no statistically significant difference in the rate of myocardial infarction (MI) between the two treatment classes. The overall propensity score–adjusted hazard ratio for MI risk comparing TNFi with IL-6i was 0.77, but the 95% confidence interval crossed 1.0.
“This result reassures patients and clinical teams about the long-term treatment effects on myocardial infarction in a real-world setting,” said Tian Zixing, a PhD student at the University of Manchester (England).
“Patients with rheumatoid arthritis have an increased risk of myocardial infarction, compared to the general population,” Ms. Tian explained. However, this risk has been “considerably improved” with biologic treatment of rheumatoid arthritis, notably with the TNFi drugs vs. nonbiologic disease-modifying antirheumatic drugs.
The rationale for the current analysis was to determine whether any MI risk is associated with IL-6i, as these drugs have been noted to increase low-density lipoprotein (LDL) cholesterol levels, which in turn can raise the risk for MI.
The study population consisted of all patients registered in the BSRBR-RA over the past 20 years who had started treatment with one of the many TNFi drugs available in the UK – adalimumab (Humira and biosimilars), etanercept (Enbrel), infliximab (Remicade and biosimilars), certolizumab pegol (Cimzia), and golimumab (Simponi) – or the two available drugs that target the effects of IL-6 – tocilizumab (RoActemra, but Actemra in the U.S.), which targets IL-6 itself, and sarilumab (Kevzara), which targets the IL-6 receptor.
Clinical follow-up forms, death certificates, and patient reports confirmed by the clinical team were used to identify patients who experienced an MI, but only MIs that occurred while on treatment were counted.
More than 30,000 lines of therapy in 20,898 patients were recorded. Ms. Tian noted that most (> 90%) patients had been treated with a TNFi across all lines of therapy.
“It is very important to consider the treatment sequence,” she said. “Most patients start first-line treatment with a TNF inhibitor, with only a few patients starting an IL-6 inhibitor,” she noted. “IL-6 inhibitors are more commonly used in the later stages of disease, when more cardiovascular risk factors have accumulated.”
Thus, to ensure that the MI risk was fairly evaluated, the statistical analyses compared TNFi and IL-6i according to the line of treatment. “That means only patients on their first-line treatment will be compared to each other, and only those on their second-line treatment will be compared to each other, and so on,” Ms. Tian explained.
Baseline characteristics were broadly similar for patients treated with TNFi and IL-6i drugs, except for hyperlipidemia, which was higher in patients treated with an IL-6i. Nevertheless, there was no suggestion of any difference in the MI rates after adjustment for cardiovascular risk factors.
The data have many strengths, Ms. Tian said, but the possibilities of residual confounding and confounding by indication remain, and missing data had to be imputed.
“There has been quite a bit around interleukin-1 blockers being cardiovascular protective,” observed Kenneth Baker, MBChB, PhD, who chaired the RA oral abstracts session during which Ms. Tian presented the findings.
“IL-6 is quite good at suppressing CRP [C-reactive protein],” added Dr. Baker, a senior clinical research fellow at Newcastle University and honorary consultant rheumatologist at Freeman Hospital, both in Newcastle upon Tyne, England.
“You’ve hypothesized or extrapolated that the differences in the lipid levels may not be relevant,” he said to Ms. Tian, “but do you think there might be an extra element going on here?” Maybe IL-6i drugs such as tocilizumab are better at suppressing inflammation, and that counterbalances the effects on lipids, he suggested.
Ms. Tian and Dr. Baker disclosed no relevant financial relationships. The BSRBR-RA is managed by the University of Manchester on behalf of the British Society for Rheumatology. The registry is supported by funding from multiple pharmaceutical companies, including AbbVie, Amgen, Celltrion Healthcare, Eli Lilly, Galapagos, Pfizer, Samsung Bioepis, and Sanofi, and in the past Hospira, Merck Sharp & Dohme, Roche, Sandoz, SOBI, and UCB.
A version of this article originally appeared on Medscape.com.
AT BSR 2023
Seasonal variation in thyroid hormone TSH may lead to overprescribing
Seasonal variation in one of the hormones used to monitor thyroid function could in turn lead to false diagnoses of subclinical hypothyroidism and unnecessary prescriptions of levothyroxine, according to Yale clinical chemist Joe M. El-Khoury, PhD.
A Japanese study of more than 7,000 healthy individuals showed that thyroid-stimulating hormone (TSH, or thyrotropin) varies widely across the seasons, he said, peaking in the northern hemisphere’s winter months (January to February) and reaching its low in the summer months (June to August). That paper was published last year in the Journal of the Endocrine Society.
But free thyroxine (FT4) levels in the Japanese population remained relatively stable, he wrote in a letter recently published in Clinical Chemistry.
“If you end up with a mildly elevated TSH result and a normal FT4, try getting retested 2-3 months later to make sure this is not a seasonal artifact or transient increase before prescribing/taking levothyroxine unnecessarily,” advised Dr. El-Khoury, director of Yale University’s Clinical Chemistry Laboratory, New Haven, Conn.
“Because the [population-based, laboratory] reference ranges don’t account for seasonal variation, we’re flagging a significant number of people as high TSH when they’re normal, and physicians are prescribing levothyroxine inappropriately to healthy people who don’t need it,” he told this news organization, adding that overtreatment can be harmful, particularly for elderly people.
This seasonal variation in TSH could account for a third to a half of the roughly 90% of levothyroxine prescriptions that a 2021 U.S. study found to be unnecessary, Dr. El-Khoury added.
In a comment, Trisha Cubb, MD, said that Dr. El-Khoury’s letter “raises a good point, that we really need to look at our reference ranges, especially when more and more studies are showing that so many thyroid hormone prescriptions may not be necessary.”
Dr. Cubb, thyroid section director and assistant professor of clinical medicine at Weill Cornell Medical College/Houston Methodist Academic Institute, Texas, also agrees with Dr. El-Khoury’s suggestion to repeat lab results in some instances.
“I think repeating results, especially in our patients with subclinical disease, is important,” she noted.
And she pointed out that seasonal variation isn’t the only relevant variable. “We also know that multiple clinical factors like pregnancy status, coexisting comorbidities, or age can all influence what we as clinicians consider an acceptable TSH range in an individual patient.” And other medications, such as steroids, or supplements like biotin, “can all affect thyroid lab values,” she noted.
“Ensuring that minor abnormalities aren’t transient is important prior to initiating medical therapy. With any medical therapy there are possible side effects, along with time, cost, [and] monitoring, all of which can be associated with thyroid hormone replacement.”
TSH reference ranges should be adapted for subpopulations
Dr. El-Khoury explained that to get an idea of how big the seasonal differences in TSH observed in the Japanese study were, “the upper end of the population they tracked goes from 5.2 [mIU/L] in January to 3.4 [mIU/L] in August. So you have almost a 2-unit change in concentration that can happen in the reference population. But laboratory reference ranges, or ‘normal ranges,’ are usually fixed and don’t change by season.”
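To make Dr. El-Khoury’s point concrete, the toy sketch below shows how a fixed upper reference limit can flag a winter TSH value that would fall inside a season-adjusted range. The monthly limits and the fixed 4.5 mIU/L cutoff are hypothetical illustrations, loosely anchored to the January (~5.2 mIU/L) and August (~3.4 mIU/L) upper limits he cited from the Japanese study; they are not the values used by any actual laboratory.

```python
# Hypothetical sketch: fixed vs. season-adjusted TSH upper reference limits.
# Values are illustrative only, interpolated between the ~5.2 mIU/L (January)
# and ~3.4 mIU/L (August) upper limits quoted from the Japanese study.

FIXED_UPPER_LIMIT = 4.5  # assumed fixed laboratory cutoff, mIU/L

SEASONAL_UPPER = {  # hypothetical month-specific upper limits, mIU/L
    "Jan": 5.2, "Feb": 5.1, "Mar": 4.8, "Apr": 4.4, "May": 4.0,
    "Jun": 3.6, "Jul": 3.5, "Aug": 3.4, "Sep": 3.7, "Oct": 4.2,
    "Nov": 4.7, "Dec": 5.0,
}

def flag(tsh: float, month: str) -> dict:
    """Flag a TSH result against both the fixed and the seasonal limit."""
    return {
        "fixed_flag": tsh > FIXED_UPPER_LIMIT,        # what labs do today
        "seasonal_flag": tsh > SEASONAL_UPPER[month],  # season-aware check
    }

# A January TSH of 5.0 mIU/L reads as "high" against the fixed cutoff,
# yet sits inside the hypothetical January reference range.
print(flag(5.0, "Jan"))  # {'fixed_flag': True, 'seasonal_flag': False}
```

Under these assumptions, the same 5.0 mIU/L result drawn in August would be flagged by both checks, which is exactly the asymmetry that could drive winter-weighted overdiagnosis.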
The higher the TSH, the more likely a person is to have hypothyroidism. Major recent studies have found no benefit of levothyroxine treatment with TSH levels below 7.0-10.0 mIU/L, he said.
“So, I suggest that the limit should be 7.0 [mIU/L] to be safe, but it could be as high as 10 [mIU/L]. In any case, let’s shift the mindset to clinical outcome–based treatment cutoffs,” he said, noting that this approach is currently used for decisions on cholesterol-lowering therapy or vitamin D supplementation, for example.
Regarding this suggestion of using a TSH cutoff of 7 mIU/L to diagnose subclinical hypothyroidism, Dr. Cubb said: “It really depends on the specific population. In an elderly patient, a higher TSH may be of less clinical concern when compared to a female who is actively trying to get pregnant.
“Overall, I think we do need to better understand what appropriate TSH ranges are in specific subpopulations, and then with time, make this more understandable and available for general medicine as well as subspecialty providers to be able to utilize,” she noted.
Regarding the particular Japanese findings cited by Dr. El-Khoury, Dr. Cubb observed that this was a very specific study population, “so we would need more data showing that this is more generalizable.”
And she noted that there’s also diurnal variation in TSH. “In the [Japanese] paper, patients had their thyroid labs drawn between 8:00 a.m. and 9:00 a.m. in a fasting state. Oftentimes in the U.S., thyroid labs are not drawn at specific times or [during] fasting. I think this is one of many factors that should be considered.”
Acknowledging seasonal variation would be a start
But overall, Dr. Cubb said that both the Japanese study and Dr. El-Khoury’s letter highlight “how season, in and of itself, which is not something we usually think about, can affect thyroid lab results. I believe as more data come out, more generalizable data, that’s how evidence-based guidelines are generated over time.”
According to Dr. El-Khoury, fixing the laboratory reference range issues would likely require a joint effort of professional medical societies, reference laboratories, and assay manufacturers. But with seasonal variation, that might be a difficult task.
“The problem is, in laboratory medicine, we don’t have rules for an analyte that changes by season to do anything different. My goal is to get people to at least acknowledge this is a problem and do something,” he concluded.
Dr. El-Khoury and Dr. Cubb have reported no relevant financial relationships.
A version of this article first appeared on Medscape.com.
FROM CLINICAL CHEMISTRY
FDA okays latest artificial pancreas, the MiniMed 780G
The Food and Drug Administration has approved Medtronic Minimed’s 780G automated insulin delivery system with the Guardian 4 sensor.
The latest so-called artificial pancreas system is approved for people aged 7 years and older who have type 1 diabetes. Medtronic will begin taking preorders for the 780G on May 15, 2023. Users of the current MiniMed 770G will be eligible for no-cost remote software upgrades.
The 780G is currently available in 105 countries. It has been available in Europe since 2020 and in the United Kingdom since 2021. It is the first automated insulin delivery system to automatically administer bolus correction insulin doses every 5 minutes to correct meal-related hyperglycemia.
This so-called meal detection technology doesn’t replace manual premeal boluses but does provide extra insulin if the premeal bolus is skipped or is insufficient.
As with other automated systems, the 780G automatically adjusts basal insulin doses up or down based on glucose levels and trends and shuts off insulin delivery to prevent hypoglycemia. The insulin pump’s infusion set can be worn for 7 days, rather than 3 days as with the older system, and the glucose target level can be set as low as 100 mg/dL.
And in contrast to the older MiniMed 670G system, which frequently dropped users out of automated mode, users of the 780G spent an average of 95% of the time in the automated “SmartGuard” mode.
In the pivotal U.S. trial, overall, patients who used the 780G spent 75% of the time in ideal glucose range (70-180 mg/dL) and 1.8% of the time below that range. Overnight, the figures were 82% and 1.5%, respectively. With the glucose target set at 100 mg/dL and active insulin time set to 2 hours, patients spent 78.8% of time in range without increased hyperglycemia.
In the ADAPT study, with the 780G, there was a 26% increase in time in ideal glucose range and a 1.4% reduction in A1c compared with results for patients who received multiple daily insulin injections with intermittently scanned continuous glucose monitoring, without an increase in hypoglycemia. Overnight, time in range increased 30.2%. The results were sustained at 1 year.
A version of this article first appeared on Medscape.com.
The Food and Drug Administration has approved Medtronic Minimed’s 780G automated insulin delivery system with the Guardian 4 sensor.
The latest so-called artificial pancreas system is approved for people aged 7 years and older who have type 1 diabetes. Medtronic will begin taking preorders for the 780G on May 15, 2023. Users of the current MiniMed 770G will be eligible for no-cost remote software upgrades.
The 780G is currently available in 105 countries. It has been available in Europe since 2020 and in the United Kingdom since 2021. It is the first automated insulin delivery system to automatically administer bolus correction insulin doses every 5 minutes to correct meal-related hyperglycemia.
This so-called meal detection technology doesn’t replace manual premeal boluses but does provide extra insulin if the premeal bolus is skipped or is insufficient.
As with other automated systems, the 780G automatically adjusts basal insulin doses up or down based on glucose levels and trends and shuts off insulin delivery to prevent hypoglycemia. The insulin pump’s infusion set can be worn for 7 days, rather than 3 days as with the older system, and the glucose target level can be set as low as 100 mg/dL.
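The closed-loop behavior described above (projecting glucose from the current trend, nudging the basal rate toward a target, and suspending delivery to prevent hypoglycemia) can be sketched as a toy controller. This is illustrative only: the actual 780G algorithm is proprietary, and the thresholds, gain, and projection horizon below are hypothetical values chosen for the example.

```python
# Toy closed-loop basal controller, in the spirit of the behavior described
# above. NOT the actual MiniMed 780G algorithm; all constants are hypothetical.

SUSPEND_THRESHOLD = 70   # mg/dL: suspend insulin delivery to prevent hypoglycemia
TARGET = 100             # mg/dL: the lowest configurable target on the 780G

def adjust_basal(glucose_mgdl: float, trend_mgdl_per_min: float,
                 basal_units_per_hr: float) -> float:
    """Return a new basal rate from the current glucose level and trend."""
    # Project glucose 30 minutes ahead using the current trend.
    projected = glucose_mgdl + 30 * trend_mgdl_per_min
    if projected <= SUSPEND_THRESHOLD:
        return 0.0  # low-glucose suspend: shut off delivery
    # Nudge the basal rate proportionally toward the target (toy gain of 0.005).
    error = projected - TARGET
    return max(0.0, basal_units_per_hr + 0.005 * error)
```

A real system layers safety limits, insulin-on-board tracking, and the meal-correction bolus logic on top of anything this simple; the sketch only shows the adjust-or-suspend shape of the loop.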
In contrast to the older MiniMed 670G system, which frequently booted users out of automated mode, the 780G kept users in the automated “SmartGuard” mode an average of 95% of the time.
In the pivotal U.S. trial, overall, patients who used the 780G spent 75% of the time in ideal glucose range (70-180 mg/dL) and 1.8% of the time below that range. Overnight, the figures were 82% and 1.5%, respectively. With the glucose target set at 100 mg/dL and active insulin time set to 2 hours, patients spent 78.8% of time in range without increased hyperglycemia.
In the ADAPT study, with the 780G, there was a 26% increase in time in ideal glucose range and a 1.4% reduction in A1c compared with results for patients who received multiple daily insulin injections with intermittently scanned continuous glucose monitoring, without an increase in hypoglycemia. Overnight, time in range increased 30.2%. The results were sustained at 1 year.
A version of this article first appeared on Medscape.com.
VA Stops Rollout of Cerner EHR To Reset Amid Continued Problems
The painful, repeatedly paused rollout of the new Cerner electronic health record (EHR) system at the US Department of Veterans Affairs (VA) is now halted as the VA announces a “reset.” The decision applies to all planned deployments. An exception is the Captain James A. Lovell Federal Health Care Center in Chicago, the only fully integrated VA and US Department of Defense (DoD) health care system, which is expected to go live in March 2024 as planned. The DoD rollout of its Cerner EHR is further along and expected to be completed in 2024.
The new plan is to redirect resources and “prioritize improvements” at the 5 sites currently using the new EHR: Spokane VA Health Care System, VA Walla Walla Health Care, Roseburg VA Health Care System, VA Southern Oregon Health Care, and VA Central Ohio Health Care System. Additional deployments will not be scheduled, the VA says, until it is confident that the new EHR is highly functioning at the current sites and ready to deliver at future sites, as demonstrated by “clear improvements” in the clinician and veteran experience, sustained high performance, and high reliability.
“For the past few years, we’ve tried to fix this plane while flying it—and that hasn’t delivered the results that veterans or our staff deserve,” said Neil Evans, MD, acting program executive director at the Electronic Health Record Modernization Integration Office. “This reset changes that. We are going to take the time necessary to get this right for veterans and VA clinicians alike, and that means focusing our resources solely on improving the EHR at the sites where it is currently in use, and improving its fit for VA more broadly. In doing so, we will enhance the EHR for both current and future users, paving the way for successful future deployments.”
The various EHR rollouts around the country have been bumpy from the beginning, operating by fits and starts as new problems surfaced and were addressed. To be fair, the whole implementation process only started in 2020 (and deployed at the first VA hospital during the COVID-19 pandemic), but in that time, the VA has had to, in its own words, “revise the timeline” again and again. The Boise VA Medical Center, for instance, was originally scheduled to go live June 25, 2022, then a month later—then 2023.
The VA Office of the Inspector General published 3 reports last year that found significant issues, including improperly routed clinical orders. VA Secretary Denis McDonough announced last July that the VA would delay EHR deployments until January 2023 to ensure that the system’s issues had been resolved. “During VA’s subsequent investigation at our current sites,” he said, “several additional technical and system issues were identified—including challenges with performance, such as latency and slowness, problems with patient scheduling, referrals, medication management, and other types of medical orders.”
In February, Ken Glueck, executive vice president of Oracle, wrote a blog post that was both apologia and explanation. Modernization, he said, “doesn’t come with a magic wand and there’s no easy button.”
After the DoD moved to Cerner for a new EHR system, the VA decided to follow suit. The goal was to create a “seamless, longitudinal record”—and that was the beginning of the largest health IT modernization project in history, Glueck said. And, although he didn’t mention it, the beginning of one of the VA’s biggest headaches. The problem, Glueck wrote, was that the new project involved “standardizing procedures and workflows that may have been different across 130 VistA implementations at largely autonomous VA medical centers.”
In June 2022—a significant month in the whole rollout process—Cerner was acquired by Oracle. By Glueck’s lights, that meant the VA “now has essentially 2 vendors for the price of one—one with extensive clinical expertise and one with extensive engineering expertise.”
Oracle, he said, “is hard at work to stabilize and improve performance; make fixes to functionality and design issues; improve training and build a better user experience.” He noted that significant improvements to the system’s capacity and performance have included reducing the most severe outage incidents by 67%.
In a recent statement, House VA Committee Chairman Mike Bost (R-IL) and Technology Modernization Subcommittee Chairman Matt Rosendale (R-MT) said, “We support Secretary McDonough’s decision in the strongest possible terms. The best way to get out of a hole is to stop digging, and we’re encouraged that VA and Oracle Cerner have finally realized that.”
VA and Oracle Cerner are currently working toward an amended contract that will “increase Oracle Cerner’s accountability to deliver a high-functioning, high-reliability, world-class EHR system,” the VA says. As part of the reset, the VA also will work with Congress on resource requirements. The VA estimates FY 2023 costs will be reduced by $400 million.
Guidelines for assessing cancer risk may need updating
The authors of a new clinical trial suggest that current guidelines for assessing hereditary cancer risk may need to be revised.
Individuals with hereditary breast and ovarian cancer (HBOC) have an 80% lifetime risk of breast cancer and are at greater risk of ovarian cancer, pancreatic cancer, prostate cancer, and melanoma. Those with Lynch syndrome (LS) have an 80% lifetime risk of colorectal cancer, a 60% lifetime risk of endometrial cancer, and heightened risk of upper gastrointestinal, urinary tract, skin, and other tumors, said study coauthor N. Jewel Samadder, MD, in a statement.
The National Comprehensive Cancer Network (NCCN) has guidelines for determining family risk for colorectal cancer and breast, ovarian, and pancreatic cancer to identify individuals who should be screened for LS and HBOC, but these rely on personal and family health histories.
“These criteria were created at a time when genetic testing was cost prohibitive and thus aimed to identify those at the greatest chance of being a mutation carrier in the absence of population-wide whole-exome sequencing. However, [LS and HBOC] are poorly identified in current practice, and many patients are not aware of their cancer risk,” said Dr. Samadder, professor of medicine and coleader of the precision oncology program at the Mayo Clinic Comprehensive Cancer Center, Phoenix, in the statement.
Whole-exome sequencing covers only protein-coding regions of the genome, which is less than 2% of the total genome but includes more than 85% of known disease-related genetic variants, according to Emily Gay, who presented the trial results (Abstract 5768) on April 18 at the annual meeting of the American Association for Cancer Research.
“In recent years, the cost of whole-exome sequencing has been rapidly decreasing, allowing us to complete this test on saliva samples from thousands, if not tens of thousands of patients covering large populations and large health systems,” said Ms. Gay, a genetic counseling graduate student at the University of Arizona, during her presentation.
She described results from the TAPESTRY clinical trial, with 44,306 participants from Mayo Clinic centers in Arizona, Florida, and Minnesota, who were identified as definitely or likely to be harboring pathogenic mutations and consented to whole-exome sequencing from saliva samples. They used electronic health records to determine whether patients would satisfy the testing criteria from NCCN guidelines.
The researchers identified 1.24% of participants as carriers of HBOC or LS. Of the HBOC carriers, 62.8% were female, as were 62.6% of the LS carriers. Of the HBOC and LS carriers, 88.6% and 94.5%, respectively, were White. The median age of both groups was 57 years. Of HBOC carriers, 47.3% had personal histories of cancer; for LS carriers, the figure was 44.2%.
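For a sense of scale, the reported carrier rate implies roughly 550 carriers in the cohort. This is a back-of-the-envelope calculation from the rounded percentages above, not a count reported by the study:

```python
# Approximate carrier count implied by the reported figures.
# Illustrative arithmetic only; the study reports rounded rates.
participants = 44_306
carrier_rate = 0.0124  # 1.24% carried an HBOC or LS pathogenic mutation

carriers = round(participants * carrier_rate)
print(carriers)  # 549
```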
Of HBOC carriers, 49.1% had been previously unaware of their genetic condition, while an even higher percentage of patients with LS – 59.3% – fell into that category. Thirty-two percent of those with HBOC and 56.2% of those with LS would not have qualified for screening using the relevant NCCN guidelines.
“Most strikingly,” 63.8% of individuals with mutations in the MSH6 gene and 83.7% of those with mutations in the PMS2 gene would not have met NCCN criteria, Ms. Gay said.
Having a cancer type not known to be related to a genetic syndrome was a reason for 58.6% of individuals failing to meet NCCN guidelines, while 60.5% did not meet the guidelines because of an insufficient number of relatives known to have a history of cancer, and 63.3% did not because they had no personal history of cancer. Among individuals with a pathogenic mutation who met NCCN criteria, 34% were not aware of their condition.
“This suggests that the NCCN guidelines are underutilized in clinical practice, potentially due to the busy schedule of clinicians or because of the complexity of using these criteria,” said Ms. Gay.
The numbers were even more striking among minorities: “There is additional data analysis and research needed in this area, but based on our preliminary findings, we saw that nearly 50% of the individuals who are [part of an underrepresented minority group] did not meet criteria, compared with 32% of the white cohort,” said Ms. Gay.
Asked what new NCCN guidelines should be, Ms. Gay replied: “I think maybe limiting the number of relatives that you have to have with a certain type of cancer, especially as we see families get smaller and smaller, especially in the United States – that family data isn’t necessarily available or as useful. And then also, I think, incorporating in the size of a family into the calculation, so more of maybe a point-based system like we see with other genetic conditions rather than a ‘yes you meet or no, you don’t.’ More of a range to say ‘you fall on the low-risk, medium-risk, or high-risk stage,’” said Ms. Gay.
During the Q&A period, session cochair Andrew Godwin, PhD, who is a professor of molecular oncology and pathology at University of Kansas Medical Center, Kansas City, said he wondered if whole-exome sequencing was capable of picking up cancer risk mutations that standard targeted tests don’t look for.
Dr. Samadder, who was in the audience, answered the question, saying that targeted tests are actually better at picking up some types of mutations like intronic mutations, single-nucleotide polymorphisms, and deletions.
“There are some limitations to whole-exome sequencing. Our estimate here of 1.2% [of participants carrying HBOC or LS mutations] is probably an underestimate. There are additional variants that exome sequencing probably doesn’t pick up easily or as well. That’s why we qualify that exome sequencing is a screening test, not a diagnostic,” he continued.
Ms. Gay and Dr. Samadder have no relevant financial disclosures. Dr. Godwin has financial relationships with Clara Biotech, VITRAC Therapeutics, and Sinochips Diagnostics.
The authors of the clinical trial suggest that these guidelines may need to be revised.
Individuals with hereditary breast and ovarian cancer (HBOC) have an 80% lifetime risk of breast cancer and are at greater risk of ovarian cancer, pancreatic cancer, prostate cancer, and melanoma. Those with Lynch syndrome (LS) have an 80% lifetime risk of colorectal cancer, a 60% lifetime risk of endometrial cancer, and heightened risk of upper gastrointestinal, urinary tract, skin, and other tumors, said study coauthor N. Jewel Samadder, MD in a statement.
The National Cancer Control Network has guidelines for determining family risk for colorectal cancer and breast, ovarian, and pancreatic cancer to identify individuals who should be screened for LS and HBOC, but these rely on personal and family health histories.
“These criteria were created at a time when genetic testing was cost prohibitive and thus aimed to identify those at the greatest chance of being a mutation carrier in the absence of population-wide whole-exome sequencing. However, [LS and HBOC] are poorly identified in current practice, and many patients are not aware of their cancer risk,” said Dr. Samadder, professor of medicine and coleader of the precision oncology program at the Mayo Clinic Comprehensive Cancer Center, Phoenix, in the statement.
Whole-exome sequencing covers only protein-coding regions of the genome, which is less than 2% of the total genome but includes more than 85% of known disease-related genetic variants, according to Emily Gay, who presented the trial results (Abstract 5768) on April 18 at the annual meeting of the American Association for Cancer Research.
“In recent years, the cost of whole-exome sequencing has been rapidly decreasing, allowing us to complete this test on saliva samples from thousands, if not tens of thousands of patients covering large populations and large health systems,” said Ms. Gay, a genetic counseling graduate student at the University of Arizona, during her presentation.
She described results from the TAPESTRY clinical trial, which enrolled 44,306 participants from Mayo Clinic centers in Arizona, Florida, and Minnesota who consented to whole-exome sequencing of saliva samples. The researchers then used electronic health records to determine whether participants identified as definitely or likely harboring pathogenic mutations would have satisfied the testing criteria from the NCCN guidelines.
The researchers identified 1.24% of participants as carriers of HBOC or LS. Of the HBOC carriers, 62.8% were female, and of the LS carriers, 62.6% were female. The percentages of HBOC and LS carriers who were White were 88.6% and 94.5%, respectively. The median age of both groups was 57 years. Of HBOC carriers, 47.3% had personal histories of cancer; for LS carriers, the percentage was 44.2%.
Of HBOC carriers, 49.1% had been previously unaware of their genetic condition, while an even higher percentage of patients with LS – 59.3% – fell into that category. Thirty-two percent of those with HBOC and 56.2% of those with LS would not have qualified for screening using the relevant NCCN guidelines.
“Most strikingly,” 63.8% of individuals with mutations in the MSH6 gene and 83.7% of those with mutations in the PMS2 gene would not have met NCCN criteria, Ms. Gay said.
Having a cancer type not known to be related to a genetic syndrome was a reason for 58.6% of individuals failing to meet NCCN guidelines, while 60.5% did not meet the guidelines because of an insufficient number of relatives known to have a history of cancer, and 63.3% did not because they had no personal history of cancer. Among individuals with a pathogenic mutation who met NCCN criteria, 34% were not aware of their condition.
“This suggests that the NCCN guidelines are underutilized in clinical practice, potentially due to the busy schedule of clinicians or because of the complexity of using these criteria,” said Ms. Gay.
The numbers were even more striking among minorities: “There is additional data analysis and research needed in this area, but based on our preliminary findings, we saw that nearly 50% of the individuals who are [part of an underrepresented minority group] did not meet criteria, compared with 32% of the white cohort,” said Ms. Gay.
Asked what new NCCN guidelines should be, Ms. Gay replied: “I think maybe limiting the number of relatives that you have to have with a certain type of cancer, especially as we see families get smaller and smaller, especially in the United States – that family data isn’t necessarily available or as useful. And then also, I think, incorporating in the size of a family into the calculation, so more of maybe a point-based system like we see with other genetic conditions rather than a ‘yes you meet or no, you don’t.’ More of a range to say ‘you fall on the low-risk, medium-risk, or high-risk stage,’” said Ms. Gay.
During the Q&A period, session cochair Andrew Godwin, PhD, professor of molecular oncology and pathology at the University of Kansas Medical Center, Kansas City, asked whether whole-exome sequencing was capable of picking up cancer risk mutations that standard targeted tests don’t look for.
Dr. Samadder, who was in the audience, answered the question, saying that targeted tests are actually better at picking up some types of mutations like intronic mutations, single-nucleotide polymorphisms, and deletions.
“There are some limitations to whole-exome sequencing. Our estimate here of 1.2% [of participants carrying HBOC or LS mutations] is probably an underestimate. There are additional variants that exome sequencing probably doesn’t pick up easily or as well. That’s why we qualify that exome sequencing is a screening test, not a diagnostic,” he continued.
Ms. Gay and Dr. Samadder have no relevant financial disclosures. Dr. Godwin has financial relationships with Clara Biotech, VITRAC Therapeutics, and Sinochips Diagnostics.
FROM AACR 2023