Weight gain linked to cancer survival in men and women
Cancer cachexia is a syndrome of weight loss that frequently occurs during cancer treatment. Consequences can include skeletal muscle loss, fatigue, functional impairment, worse quality of life, and worse survival. On the other hand, weight gain during cancer treatment has been tied to better survival.
“The finding that weight gain occurred in subsets of males and females is a new observation. The fact that weight gain occurs in cancer patients during anticancer treatment could confound results of clinical [trials] evaluating novel anticachexia treatments. Simultaneously studying longitudinal body weights and serum and cellular biomarkers in cancer patients might provide insights into mechanisms involved in cachexia. Increased understanding of mechanisms driving cachexia could lead to new therapeutic strategies,” said study coauthor Philip Bonomi, MD, who is an oncologist at Rush Medical College, Chicago.
“This data, although it appears to be very basic, is critically important, especially as we consider our novel interventions in the treatment of cancer cachexia,” said Eric Roeland, MD, during his presentation of the study at the annual meeting of the European Society for Medical Oncology. Dr. Roeland is a medical oncologist at Oregon Health & Science University, Portland.
Dr. Roeland is also the lead author of cancer cachexia guidelines published by the American Society of Clinical Oncology in 2020. The guidelines suggest that dietary counseling can be offered to patients, but warn against routine use of enteral feeding tubes and parenteral nutrition. Although no specific drug can be recommended for cancer cachexia, progesterone analogs and corticosteroids used over the short term (weeks) can be tried on a trial basis to improve appetite and promote weight gain. While not approved in the United States, anamorelin was approved in 2020 in Japan for cancer cachexia in NSCLC, gastric cancer, pancreatic cancer, and colorectal cancer.
The new study should raise awareness of the importance of adverse effects of cancer treatments, said Karin Jordan, MD, University Hospital Heidelberg (Germany). She served as a discussant following the presentation. “As a medical oncologist, we focus a bit too much on the benefits of antineoplastic therapy, both on cure and on the survival benefit. But what is also very, very important to do is a balanced oncology treatment to focus on the risks of oncology therapies,” she said.
The study is limited by its retrospective nature and potential for bias. “The hypothesis that weight gain leads to improved survival is not really proven as it likewise may be the other way around,” Dr. Jordan said.
However, in oncology research, a phenomenon called the “obesity paradox” is increasingly catching the interest of investigators. Observational studies have shown that overweight patients with certain cancers (specifically, colorectal, endometrial, and lung cancer) actually have improved overall survival compared with normal-weight patients.
Details from the new study
The researchers pooled data from 1,030 patients who participated in three phase 3 clinical trials conducted between 2005 and 2011. The patients all received platinum-based chemotherapy as part of control arms. Of these, 304 were female and 726 were male. The median age was 62 years; 16.7% were Asian, the mean body mass index was 24.6 kg/m2, 88.5% had stage 4 disease, 36.9% had adenocarcinoma, and 86.3% were current or former smokers.
Males and females had similar magnitudes and rates of weight gain over the course of treatment. Any weight gain was associated with improved overall survival in both males (12.7 vs. 8.0 months; hazard ratio, 0.60; P < .001) and females (16.2 vs. 10.1 months; HR, 0.65; P = .0028). Patients who had a weight gain of 2.5% of body weight or more saw an improvement in overall survival, in both males (14.0 vs. 8.2 months; HR, 0.57; P < .001) and females (16.7 vs. 11.3 months; HR, 0.61; P = .0041).
A weight gain of 5% or more was associated with improved survival in males (13.6 vs. 8.9 months; HR, 0.62; P = .0001), but there was no statistically significant association in females (16.7 vs. 12.6 months; HR, 0.69; P = .1107).
Regardless of weight-gain status, males had lower survival rates than females. All of the associations were independent of smoking status.
The study was funded by Pfizer. Dr. Bonomi has received honoraria from Pfizer and Helsinn for participation in scientific advisory boards. Dr. Jordan has consulted for Amgen, Hexal, Riemser, Helsinn, Voluntis, Pfizer, and BD Solution. She has received research funding from Deutsche Krebshilfe. She has received honoraria from MSD, Merck, Amgen, Hexal, Riemser, Helsinn, Voluntis, Pfizer, Pomme-med, PharmaMar, arttemoi, OnkoUpdate, Stemline, and Roche.
FROM ESMO CONGRESS 2022
Is vitamin B12 protective against Parkinson’s disease?
A high baseline intake of vitamin B12 is linked to lower risk of developing Parkinson’s disease, new research suggests. “The results leave the door open for the possibility that vitamin B12 may have a beneficial effect in protecting against Parkinson’s disease,” said lead author Mario H. Flores, PhD, a research fellow at Harvard T.H. Chan School of Public Health, Boston.
The findings were presented at the International Congress of Parkinson’s Disease and Movement Disorders.
B vitamins and Parkinson’s disease
Previous preclinical studies have suggested that B vitamins protect against Parkinson’s disease by decreasing plasma homocysteine levels and through other neuroprotective effects. However, there have been only two epidemiologic studies of B vitamins in Parkinson’s disease – and their results were inconsistent, Dr. Flores noted.
The new study included 80,965 women from the Nurses’ Health Study and 48,837 men from the Health Professionals Follow-up Study. All completed a food frequency questionnaire at baseline and every 4 years.
Researchers collected information on dietary, supplemental, and total intake of folate, vitamin B6, and vitamin B12 over the course of about 30 years up to 2012. They estimated hazard ratios and 95% confidence intervals for Parkinson’s disease according to quintiles of cumulative average intake.
During follow-up, 495 women and 621 men were diagnosed with Parkinson’s disease.
The investigators adjusted for potential confounders, including age, year, smoking status, physical activity, intake of alcohol or caffeine, hormone use (in women), intake of dairy and flavonoids, and Mediterranean diet score.
Analyses of cumulative average total folate, B6, and B12 intake were not associated with Parkinson’s disease risk. “The results of the primary analysis of cumulative intake were not significant for any of the vitamins we looked at,” said Dr. Flores.
Researchers also conducted secondary analyses, including assessment of how the most recent intake of B vitamins related to Parkinson’s disease risk. This analysis also did not find a significant association.
However, when examining baseline intake of vitamin B12, “we saw some suggestion for a potential inverse association with Parkinson’s disease,” Dr. Flores said.
Among individuals with higher total intake of vitamin B12, there was a lower risk for Parkinson’s disease (pooled hazard ratio for top vs. bottom quintile, 0.74; 95% confidence interval [CI], 0.60-0.89; P for trend, .001). Intake from both diet and supplements contributed to this inverse association, the investigators noted.
Dietary sources of vitamin B12 include poultry, meat, fish, and dairy products; however, the main sources in this study were multivitamins/supplements and enriched foods such as cereals, said Dr. Flores.
Several limitations
In an attempt to overcome risk for reverse causality, the researchers examined B12 intake during four lagged exposure periods: 8-, 12-, 16- and 20-year lags. They found a significant relationship between intake for the 20-year lag time and development of Parkinson’s disease.
Overall, the study results provide support for a possible protective effect of early intake of vitamin B12 on the development of Parkinson’s disease, Dr. Flores noted.
In addition to being involved in the regulation of homocysteine levels, vitamin B12 may help regulate leucine-rich repeat kinase 2 (LRRK2), an enzyme implicated in the pathogenesis of Parkinson’s disease, he said.
However, the study did not examine how B12 deficiency might relate to risk for Parkinson’s disease, which “is something worth looking at in future studies,” said Dr. Flores.
He added that although findings from a single study cannot translate into recommendations on ideal vitamin B12 intake to prevent or delay Parkinson’s disease onset, the median intake in the highest quintile that the study linked to less Parkinson’s disease risk was 18 mcg/d at baseline. The amount in multivitamins can vary from 5 to 25 mcg.
Dr. Flores said a limitation of the study was that it included U.S. health care professionals, “most of whom arguably have very good nutritional status.”
As well, assessment of vitamin B intake was self-reported, so there might have been measurement error – and there may have been an unmeasured confounding factor that could explain the associations.
Dr. Flores also stressed that the effect of B12 on Parkinson’s disease risk “was very modest,” and the results need to be confirmed in other studies “ideally looking at circulating levels of vitamin B12.”
Not ready to recommend
Commenting on the research, Michael S. Okun, MD, medical adviser at the Parkinson’s Foundation and professor and director of the Norman Fixel Institute for Neurological Diseases at the University of Florida, Gainesville, noted that other recent studies have suggested high-dose B12 may be preventive and a possible treatment in Parkinson’s disease.
“Although only a secondary aim of the current study, there was a reported potential benefit” in this new study, too, said Dr. Okun, who was not involved with the research.
However, the evidence is still not strong enough to change prescribing habits, he noted. “We do not recommend high-dose B12 either for those at genetic risk of Parkinson’s or those already with the disease,” Dr. Okun said.
He added that because multiple recent studies have questioned the beneficial effects for multivitamin combinations used to prevent neurologic diseases, “it wasn’t surprising to see results showing a lack of protection against later-onset Parkinson’s disease with [cumulative] folate, B6, and B12 intake” in the current study.
The study was supported by the NIH. Dr. Flores and Dr. Okun have reported no relevant financial relationships.
A version of this article first appeared on Medscape.com.
FROM MDS 2022
No such thing as an easy fix
Recently an article crossed my screen reporting that drinking 4 cups of tea per day lowered the risk of type 2 diabetes by 17%. As these things always seem to, it ended with a variant of “further research is needed.”
Encouraging? Sure. Definite? Nope.
I’ve seen plenty of articles suggesting coffee and/or tea have health benefits, though exactly what benefit varies, from a longer lifespan to a lower risk of a chronic medical condition (in this case, type 2 diabetes).
There are always numerous variables that aren’t clear. What kind of tea? Decaf or regular? Hot or iced? When you say cup, what do you mean? A lot of people, including me, probably consider anything smaller than a Starbucks grande to be for wimps.
While I can’t think of any off the top of my head, there’s probably a reasonable chance that, if I looked, I could find something that says coffee or tea are bad for you in some way, too.
Not that I’m planning on changing my already caffeinated drinking habits, which is probably the crux of these things for most of us. In a given day I have 1-2 cups of coffee and 3-4 bottles of diet green tea. Maybe 1-2 Diet Cokes in there some days. In winter, more hot black tea. I’m probably a poster child for methylxanthine toxicity.
I have no idea if all that coffee and tea are doing anything besides keeping me awake and alert for my patients. If they are, I certainly hope they’re lowering my risk of something bad.
Articles like this always get attention and are often picked up by the general media. People love to think that something as simple as drinking more tea or coffee would make a big difference in their lives. So the story gets forwarded, and people never read past the first paragraph or two, never making it to the “further research is needed” line.
If an article ever came out refuting it, it probably wouldn’t get nearly as much press (who wants to read bad news?) and would be quickly forgotten outside of medical circles.
But the reality is that people are really looking for shortcuts. Unless you live under a rock, it’s pretty clear to both medical and lay people that exercise and a healthy diet can help avoid multiple chronic health conditions. That doesn’t mean most of us, myself included, will do them faithfully. It just takes less time and effort to drink more tea than to go to the gym, so we want to believe.
That’s just human nature.
Dr. Block has a solo neurology practice in Scottsdale, Ariz.
COMMENT & CONTROVERSY
CAN WE RETURN TO THE ABCS OF CRAFTING A MEDICAL RECORD NOTE?
ROBERT L. BARBIERI, MD (OCTOBER 2021)
Physicians can help provide EMR fixes
I appreciate Dr. Barbieri’s editorials and insight on many issues facing our profession. I would like to offer my comments on a recent article.
If you want your brakes fixed, don’t go to a shoemaker. Physicians seem to have lost our sense of who is most competent to determine the best way to practice and communicate medical care. Somehow we have turned this over to bureaucrats, who seem to find ways to complicate the lives of both providers and patients. Maybe we are too busy caring for patients and trying to find ways to alleviate the burden placed on our time by the electronic medical record (EMR) system, which was touted as improving medical care and provider efficiency. Most of the time I hear my colleagues describing ways to “work around” an EMR system that has immense deficiencies in presenting accurate information in a way that is easily digested by whoever is viewing the record. The promised universal ability to transfer information simply does not exist. One colleague had the same office version of Cerner as was used in the hospital setting but was unable to send information back and forth because of the potential to corrupt the system.
Dr. Barbieri mentioned his workaround to make the record easier for the patient to read. I ask, what about the coding descriptions, which most systems now require physicians to enter at the time of the encounter? In the past this was done by certified coders, who undergo a 1- to 2-year training program; it is now being performed by physicians who have minimal to no training in coding. (And who, by the way, can be fined for both under- and overcoding.) The example Dr. Barbieri put forth for obesity comes to mind and is part of the medical record in all cases. The terminology used by ICD-10 is not so kind and requires some imagination when trying to find the right code for many diagnoses.
When will we stop allowing others, who know little about medicine and caring for patients, to tell us how to provide the care that we have trained 7-12 years to deliver?
William Sutton, MD
Muncie, Indiana
Dr. Barbieri responds
I thank Dr. Sutton for providing his experience with the electronic medical record. I agree with him that bureaucrats often create health care rules that do more to hinder than help patients. With regard to coding and billing, I use ICD-10 codes and usually bill based on time, which includes both face-to-face time with the patient and time spent reviewing the patient’s medical records. Now that federal regulations require medical notes to be shared with patients, I craft my history, assessment, and plan with language that is easy for a patient to accept and understand, avoiding medical terms that patients might misinterpret.
Should microscopy be replaced?
I agree with many points Dr. Barbieri made in his editorial. However, I do not agree that microscopic examination of the vaginal discharge should be replaced. Nucleic acid amplification testing (NAAT) offers some advantages, but it does not offer a complete assessment of the vaginal ecosystem and microbiome. I believe that NAAT should be used in conjunction with the pelvic examination and microscopic examination of the vaginal discharge.
Microscopic examination of the vaginal discharge can reveal:
- whether or not the squamous epithelial cells are estrogenized. The absence of estrogen will, along with physical findings, indicate the possibility that the patient is experiencing atrophic vaginitis.
- the presence of estrogenized squamous epithelial cells. In addition, a finding of erythema of the vaginal epithelium indicates an inflammatory condition, suggesting a possible infection in addition to vaginitis.
- the presence of white blood cells (>5 per field at 40× magnification), which indicates the possible presence of infection in addition to vaginitis (eg, BV).
I agree that NAAT can confirm an initial diagnosis or refute it. In the latter case, the physician can change treatment accordingly. In the absence or in conjunction with the presence of a sexually transmitted infection, the composition of the vaginal microbiome is significant (ie, determining if vaginal dysbiosis is present). Performing a comprehensive evaluation, determining if the most common pathogens are present in aerobic vaginitis and/or BV, plus completing a Lactobacillus panel can be expensive. If insurance companies do not pay for such testing, patients will be reluctant to pay out of pocket for these tests.
My final comment addresses treatment guided by NAAT for aerobic vaginitis and BV, which is probably ineffective. Vaginal dysbiosis is defined by whether the appropriate species of Lactobacillus is present, and at what concentration. Treatment most likely will be based on replenishing or restoring the appropriate species of Lactobacillus to dominance.
Sebastian Faro, MD, PhD
Houston, Texas
Dr. Barbieri responds
I agree with Dr. Faro; when used by highly trained clinicians, microscopy is an excellent tool for evaluating vaginal specimens. Expert clinicians, such as Dr. Faro, with a focus on infectious diseases do not need to rely on NAAT testing except for identifying cases of T vaginalis infection. However, in standard clinical practice, microscopy performs poorly, resulting in misdiagnosis.1 In the average clinical practice, NAAT testing may help improve patient outcomes.
1. Gaydos CA, Beqaj S, Schwebke JR, et al. Clinical validation of a test for the diagnosis of vaginitis. Obstet Gynecol. 2017;130:181-189.
A note of thanks
I am a 74-year-old ObGyn who finished training at the University of North Carolina in 1979. Currently, I am working at a rural health group 2 days a week as a source of in-house gyn referral for 17 primary care physicians and mid-level providers. Our patients are almost all underserved and self-pay. The bulk of my work is related to evaluating abnormal uterine bleeding and abnormal Pap tests. Your publication of OBG Management serves now as one of my main sources of information. I just wanted to thank you and let you know that the publication is important. Keep up the good work and best wishes.
Julian Brantley, MD
Rocky Mount, North Carolina
Dr. Barbieri responds
I thank Dr. Brantley for taking time from a busy practice to write about how OBG Management provides practical information relevant to practice. Each issue of OBG Management is built on a foundation of insights from expert clinicians, which is crafted into a finished product by a superb editorial team. Our goal is to enhance the quality of women’s health care and the professional development of obstetrician-gynecologists and all women’s health care clinicians. ●
FDA OKs selpercatinib for adults with RET-fusion+ solid tumors
The approval covers RET fusion–positive solid tumors that have progressed during or following systemic treatment, or patients for whom there are no good alternative treatments.
In 2020, selpercatinib received accelerated approval for lung and thyroid RET-positive tumors; that approval transitioned to a regular approval for non–small cell lung cancer on Sept. 21. The latest approval expands the drug label to include an array of RET-positive tumor types, including pancreatic and colorectal cancers.
The approval was based on data from the phase 1/2 LIBRETTO-001 trial, which evaluated 41 patients with RET fusion–positive tumors. Thirty-seven patients (90%) had received prior systemic therapy, with almost one-third receiving three or more. Primary efficacy measures were overall response rate and duration of response.
Among the 41 patients, the overall response rate was 44%, with a duration of response of 24.5 months. Additionally, for 67% of patients, results lasted at least 6 months.
“In the LIBRETTO-001 trial, selpercatinib demonstrated clinically meaningful and durable responses across a variety of tumor types in patients with RET-driven cancers,” Vivek Subbiah, MD, coinvestigator for the trial, said in a press release. “These data and FDA approval of the tumor-agnostic indication underscore the importance of routine, comprehensive genomic testing for patients across a wide variety of tumor types.”
The most common cancers in the study were pancreatic adenocarcinoma (27%), colorectal cancer (24%), and salivary cancer (10%).
The recommended selpercatinib dose, based on body weight, is 120 mg orally twice daily for patients who weigh less than 110 pounds, or 160 mg orally twice daily for those who weigh 110 pounds or more.
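For readers who build clinical decision-support tooling, the weight threshold above can be sketched in a few lines of Python. This is purely an illustration of the rule as reported here; the function name and interface are hypothetical, and this is not clinical software.

```python
def selpercatinib_dose_mg(weight_lb: float) -> int:
    """Illustrative sketch of the reported weight-based dosing rule
    (hypothetical helper, not clinical software): 120 mg twice daily
    for patients under 110 lb, 160 mg twice daily at 110 lb or more."""
    return 120 if weight_lb < 110 else 160

# Example: a 95-lb patient falls below the threshold, a 150-lb patient above it.
print(selpercatinib_dose_mg(95))   # 120
print(selpercatinib_dose_mg(150))  # 160
```

Note that the boundary case (exactly 110 pounds) falls into the higher-dose group, matching the "110 pounds or more" wording.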
The most common adverse reactions were edema, diarrhea, fatigue, dry mouth, hypertension, abdominal pain, constipation, rash, nausea, and headache.
A version of this article first appeared on Medscape.com.
that have progressed during or following systemic treatment, or for patients for whom there are no good alternative treatments.
In 2020, selpercatinib received accelerated approval for lung and thyroid RET-positive tumors; that approval transitioned to a regular approval for non–small cell lung cancer on Sept. 21. The latest approval expands the drug label to include an array of RET-positive tumor types, including pancreatic and colorectal cancers.
The approval was based on data from the phase 1/2 LIBRETTO-001 trial, which evaluated 41 patients with RET fusion–positive tumors. Thirty-seven patients (90%) had received prior systemic therapy, with almost one-third receiving three or more. Primary efficacy measures were overall response rate and duration of response.
Among the 41 patients, the overall response rate was 44%, with a duration of response of 24.5 months. Additionally, for 67% of patients, results lasted at least 6 months.
“In the LIBRETTO-001 trial, selpercatinib demonstrated clinically meaningful and durable responses across a variety of tumor types in patients with RET-driven cancers,” Vivek Subbiah, MD, coinvestigator for the trial, said in a press release. “These data and FDA approval of the tumor-agnostic indication underscore the importance of routine, comprehensive genomic testing for patients across a wide variety of tumor types.”
The most common cancers in the study were pancreatic adenocarcinoma (27%), colorectal cancer (24%), and salivary cancer (10%).
The recommended selpercatinib dose, based on body weight, is 120 mg orally twice daily for people who weigh less than 110 pounds or 160 mg orally twice daily for who weigh 110 pounds or more.
The most common adverse reactions were edema, diarrhea, fatigue, dry mouth, hypertension, abdominal pain, constipation, rash, nausea, and headache.
A version of this article first appeared on Medscape.com.
that have progressed during or following systemic treatment, or for patients for whom there are no good alternative treatments.
In 2020, selpercatinib received accelerated approval for lung and thyroid RET-positive tumors; that approval transitioned to a regular approval for non–small cell lung cancer on Sept. 21. The latest approval expands the drug label to include an array of RET-positive tumor types, including pancreatic and colorectal cancers.
The approval was based on data from the phase 1/2 LIBRETTO-001 trial, which evaluated 41 patients with RET fusion–positive tumors. Thirty-seven patients (90%) had received prior systemic therapy, with almost one-third receiving three or more. Primary efficacy measures were overall response rate and duration of response.
Among the 41 patients, the overall response rate was 44%, with a median duration of response of 24.5 months. Responses lasted at least 6 months in 67% of responders.
“In the LIBRETTO-001 trial, selpercatinib demonstrated clinically meaningful and durable responses across a variety of tumor types in patients with RET-driven cancers,” Vivek Subbiah, MD, coinvestigator for the trial, said in a press release. “These data and FDA approval of the tumor-agnostic indication underscore the importance of routine, comprehensive genomic testing for patients across a wide variety of tumor types.”
The most common cancers in the study were pancreatic adenocarcinoma (27%), colorectal cancer (24%), and salivary cancer (10%).
The recommended selpercatinib dose, based on body weight, is 120 mg orally twice daily for people who weigh less than 110 pounds or 160 mg orally twice daily for those who weigh 110 pounds or more.
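The dosing rule above is a single weight threshold; as a minimal sketch (the function name is our own, and the values simply restate the label summary above), it amounts to:

```python
def selpercatinib_dose_mg(weight_lb: float) -> int:
    """Twice-daily selpercatinib dose per the label summary above:
    120 mg for patients under 110 lb, 160 mg at 110 lb or more."""
    return 120 if weight_lb < 110 else 160

print(selpercatinib_dose_mg(95))   # 120
print(selpercatinib_dose_mg(150))  # 160
```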
The most common adverse reactions were edema, diarrhea, fatigue, dry mouth, hypertension, abdominal pain, constipation, rash, nausea, and headache.
A version of this article first appeared on Medscape.com.
Childhood cow’s milk allergy raises health care costs
Managing children’s cow’s milk allergy is costly to families and to health care systems, largely owing to costs of prescriptions, according to an industry-sponsored study based on data from the United Kingdom.
“This large cohort study provides novel evidence of a significant health economic burden of cow’s milk allergy in children,” Abbie L. Cawood, PhD, RNutr, MICR, head of scientific affairs at Nutricia Ltd in Trowbridge, England, and colleagues wrote in Clinical and Translational Allergy.
“Management of cow’s milk allergy necessitates the exclusion of cow’s milk protein from the diet. Whilst breastmilk remains the ideal nutrient source in infants with cow’s milk allergy, infants who are not exclusively breastfed require a hypoallergenic formula,” added Dr. Cawood, a visiting research fellow at University of Southampton, and her coauthors.
Cow’s milk allergy, an immune‐mediated response to one or more proteins in cow’s milk, is one of the most common childhood food allergies and affects 2%-5% of infants in Europe. Management involves avoiding cow’s milk protein and treating possible related gastrointestinal, skin, respiratory, and other allergic conditions, the authors explained.
In their retrospective matched cohort study, Dr. Cawood and colleagues turned to The Health Improvement Network (THIN), a Cegedim Rx proprietary database of 2.9 million anonymized active patient records. They extracted data from nearly 7,000 case records covering 5 years (2015-2020).
They examined medication prescriptions, health care professional contacts based on diagnosis read-codes, and hypoallergenic formula prescriptions, and compared health care costs for children with cow’s milk allergy with those for children without.
They matched 3,499 children aged 1 year or younger who had confirmed or suspected cow’s milk allergy with the same number of children without cow’s milk allergy. Around half of the participants were boys, and the mean observation period was 4.2 years.
Children with cow’s milk allergy need more, and costlier, health care
The researchers found:
- Medications were prescribed to significantly more children with cow’s milk allergy (CMA), and at a higher rate, than to those without CMA. In particular, antireflux medications were prescribed at a rate almost 500% higher.
- Children with CMA needed significantly more health care contacts and at a higher rate than those without CMA.
- CMA was linked with additional potential health care costs of £1,381.53 per person per year. Assuming a 2.5% prevalence (the midpoint of the estimated 2%-5% range) and extrapolating to the U.K. infant population, CMA may have added more than £25.7 million in annual health care costs nationwide.
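The £25.7 million figure can be checked with back-of-envelope arithmetic. The infant-population number below is an assumption on our part (roughly one year of U.K. births); the study's exact denominator is not given here:

```python
cost_per_child_gbp = 1381.53   # additional annual cost per child with CMA (from the study)
prevalence = 0.025             # midpoint of the estimated 2%-5% range
uk_infants = 744_000           # assumed: approximate size of one U.K. birth cohort

total = uk_infants * prevalence * cost_per_child_gbp
print(f"£{total / 1e6:.1f} million")  # → £25.7 million
```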
“Several conditions in infancy necessitate the elimination of cow milk–based formulas and require extensively hydrolyzed or amino acid formulas or, if preferred or able, exclusive breast milk,” Kara E. Coffey, MD, assistant professor of pediatrics at the University of Pittsburgh, said by email.
“This study shows that, regardless of the reason for cow milk–based avoidance, these infants require more healthcare service utilizations (clinic visits, nutritional assessments, prescriptions) than [do] their peers, which is certainly a commitment of a lot of time and money for their families to ensure their ability to grow and thrive,” added Dr. Coffey, who was not involved in the study.
Jodi A. Shroba, MSN, APRN, CPNP, the coordinator for the Food Allergy Program at Children’s Mercy Kansas City, Mo., did not find these numbers surprising.
“Children with food allergies typically have other atopic comorbidities that require more visits to primary care physicians and specialists and more prescriptions,” Ms. Shroba, who was not involved in the study, said by email.
“An intriguing statement is that the U.K. guidelines recommend the involvement of a dietitian for children with cow’s milk allergy,” she noted. “In the United States, having a dietitian involved would be a wonderful addition to care, as avoidance of cow’s milk can cause nutritional and growth deficiencies. But not all healthcare practices have those resources available.
“The higher rate of antibiotic use and the almost 500% increase of antireflux prescriptions by the children with cow’s milk allergy warrant additional research,” she added.
Nutricia Ltd. funded the study. Dr. Cawood and one coauthor are employed by Nutricia, and all other coauthors have been employees of or have other financial relationships with Nutricia. One coauthor is employed by Cegedim Rx, which was funded for this research by Nutricia. Ms. Shroba and Dr. Coffey report no conflicts of interest with the study.
A version of this article first appeared on Medscape.com.
FROM CLINICAL AND TRANSLATIONAL ALLERGY
Legacy of neutral renal denervation trial recast by long-term outcomes: SYMPLICITY HTN-3
BOSTON – There’s an intriguing plot twist in the story of SYMPLICITY HTN-3, the sham-controlled clinical trial that nearly put the kibosh on renal denervation (RDN) therapy as a promising approach to treatment-resistant hypertension (HTN).
The trial famously showed no benefit for systolic blood pressure (BP) from the invasive procedure at 6 months and 12 months, dampening enthusiasm for RDN in HTN for both physicians and industry. But it turns out that disappointment in the study may have been premature.
The procedure led to significant improvements in systolic BP, whether in-office or ambulatory, compared with a sham control procedure, in a new analysis that followed the trial’s patients out to 3 years. Those who underwent RDN also required less intense antihypertensive drug therapy.
“These findings support that durable blood pressure reductions with radiofrequency renal artery denervation, in the presence of lifestyle modification and maximal medical therapy, are safely achievable,” Deepak L. Bhatt, MD, said in a Sept. 18 presentation at the Transcatheter Cardiovascular Therapeutics annual meeting, which was sponsored by the Cardiovascular Research Foundation.
Dr. Bhatt, of Boston’s Brigham and Women’s Hospital and Harvard Medical School, is lead author on the report published in The Lancet simultaneously with his presentation.
Strides in RDN technology and trial design since the neutral primary SYMPLICITY HTN-3 results were reported in 2014 have long since restored faith in the procedure, which is currently in advanced stages of clinical trials and expected to eventually make a mark on practice.
But Roxana Mehran, MD, not connected to SYMPLICITY HTN-3, expressed caution in interpreting the current analysis based on secondary endpoints and extended follow-up time.
And elsewhere at the TCT sessions, observers of the trial as well as Dr. Bhatt urged similar cautions interpreting “positive” secondary results from trials that were “negative” in their primary analyses.
Still, “I believe there is no question that we have now enough evidence to say that renal denervation on top of medications is probably something that we’re going to be seeing in the future,” Dr. Mehran, of the Icahn School of Medicine at Mount Sinai, New York, told this news organization.
Importantly, and a bit controversially, the RDN group in the 36-month SYMPLICITY HTN-3 analysis includes patients originally assigned to the sham control group who crossed over to receive RDN after the trial was unblinded. Their “control” BP responses were thereafter imputed by accepted statistical methodology that Dr. Bhatt characterized as “last observation carried forward.”
That’s another reason to be circumspect about the current results, observed Naomi Fisher, MD, also of Brigham and Women’s and Harvard Medical School, as a panelist following Dr. Bhatt’s formal presentation.
“With all the missing data and imputational calculations,” she said, “I think we have to apply caution in the interpretation.”
She also pointed out that blinding in the trial was lifted at 6 months, allowing patients to learn their treatment assignment, and potentially influencing subsequent changes to medications.
They were prescribed, on average, about five antihypertensive meds, Dr. Fisher noted, and “that’s already a red flag. Patients taking that many medications generally aren’t universally taking them. There’s very high likelihood that there could have been variable adherence.”
Patients who learned they were in the sham control group, for example, could have “fallen off” taking their medications, potentially worsening outcomes and amplifying the apparent benefit of RDN. Such an effect, Dr. Fisher said, “could have contributed” to the study’s long-term results.
As previously reported, the single-blind SYMPLICITY HTN-3 had randomly assigned 535 patients to either RDN or a sham control procedure, 364 and 171 patients respectively, at 88 U.S. centers. The trial used the Symplicity Flex RDN radiofrequency ablation catheter (Medtronic).
For study entry, patients were required to have office systolic BP of at least 160 mm Hg and 24-hour ambulatory systolic BP of at least 135 mm Hg despite stable, maximally tolerated dosages of a diuretic plus at least two other antihypertensive agents.
Blinding was lifted at 6 months, per protocol, after which patients in the sham control group who still met the trial’s BP entry criteria were allowed to cross over and undergo RDN. The 101 controls who crossed over were combined with the original active-therapy cohort for the current analysis.
From baseline to 36 months, the mean number of medication classes per patient remained between 4.5 and 5, with no significant difference between groups at any point.
However, medication burden, expressed as the number of doses per day, held steady at 9.7 to 10.2 for controls, while the RDN group showed a steady decline from 10.2 to 8.4. Differences between RDN patients and controls were significant at both 24 months (P = .01) and 36 months (P = .005), Dr. Bhatt reported.
All relative decreases in blood pressure favored the RDN group (P < .0001).
The RDN group spent a greater proportion of time with systolic BP at goal than the sham control group in an analysis that did not involve imputation of data, Dr. Bhatt reported. The proportions of time in therapeutic range were 18% for RDN patients and 9% for controls (P < .0001).
As in the 6- and 12-month analyses, there was no adverse safety signal associated with RDN in follow-up out to both 36 and 48 months. As Dr. Bhatt reported, the rates of the composite safety endpoint in RDN patients, crossovers, and noncrossover controls were 15%, 14%, and 14%, respectively.
The safety endpoint included death, new end-stage renal disease, significant embolic events causing end-organ damage, vascular complications, renal-artery reintervention, and “hypertensive emergency unrelated to nonadherence to medications,” Dr. Bhatt reported.
There are many patients with “out of control” HTN “who cannot remain compliant on their medications,” Dr. Mehran observed for this news organization. “I believe having an adjunct to medical management of these patients,” that is RDN, “is going to be tremendously important.”
SYMPLICITY HTN-3 was funded by Medtronic. Dr. Bhatt has disclosed ties with many companies, as well as WebMD, Medscape Cardiology, and other publications or organizations. Dr. Mehran disclosed ties to Abbott Vascular, AstraZeneca, Bayer, Bristol-Myers Squibb, CSL Behring, Daiichi-Sankyo/Eli Lilly, Medtronic, Novartis, OrbusNeich, Abiomed; Boston Scientific, Alleviant, Amgen, AM-Pharma, Applied Therapeutics, Arena, BAIM, Biosensors, Biotronik, CardiaWave, CellAegis, Concept Medical, CeloNova, CERC, Chiesi, Cytosorbents, Duke University, Element Science, Faraday, Humacyte, Idorsia, Insel Gruppe, Philips, RenalPro, Vivasure, and Zoll; as well as Medscape/WebMD, and Cine-Med Research; and holding equity, stock, or stock options with Control Rad, Applied Therapeutics, and Elixir Medical. Dr. Fisher disclosed ties to Medtronic, Recor Medical, and Aktiia; and receiving grants or hold research contracts with Recor Medical and Aktiia.
A version of this article first appeared on Medscape.com.
BOSTON – There’s an intriguing plot twist in the story of SYMPLICITY HTN-3, the sham-controlled clinical trial that nearly put the kibosh on renal denervation (RDN) therapy as a promising approach to treatment-resistant hypertension (HTN).
The trial famously showed no benefit for systolic blood pressure (BP) from the invasive procedure at 6 months and 12 months, dampening enthusiasm for RDN in HTN for both physicians and industry. But it turns out that disappointment in the study may have been premature.
The procedure led to significant improvements in systolic BP, whether in-office or ambulatory, compared with a sham control procedure, in a new analysis that followed the trial’s patients out to 3 years. Those who underwent RDN also required less intense antihypertensive drug therapy.
“These findings support that durable blood pressure reductions with radiofrequency renal artery denervation, in the presence of lifestyle modification and maximal medical therapy, are safely achievable,” Deepak L. Bhatt, MD, said in a Sept. 18 presentation at the Transcatheter Cardiovascular Therapeutics annual meeting, which was sponsored by the Cardiovascular Research Foundation.
Dr. Bhatt, of Boston’s Brigham and Women’s Hospital and Harvard Medical School, is lead author on the report published in The Lancet simultaneously with his presentation.
Strides in RDN technology and trial design since the neutral primary SYMPLICITY HTN-3 results were reported in 2014 have long since restored faith in the procedure, which is currently in advanced stages of clinical trials and expected to eventually make a mark on practice.
But Roxana Mehran, MD, not connected to SYMPLICITY HTN-3, expressed caution in interpreting the current analysis based on secondary endpoints and extended follow-up time.
And elsewhere at the TCT sessions, observers of the trial as well as Dr. Bhatt urged similar cautions interpreting “positive” secondary results from trials that were “negative” in their primary analyses.
Still, “I believe there is no question that we have now enough evidence to say that renal denervation on top of medications is probably something that we’re going to be seeing in the future,” Dr. Mehran, of the Icahn School of Medicine at Mount Sinai, New York, told this news organization.
Importantly, and a bit controversially, the RDN group in the 36-month SYMPLICITY HTN-3 analysis includes patients originally assigned to the sham control group who crossed over to receive RDN after the trial was unblinded. Their “control” BP responses were thereafter imputed by accepted statistical methodology that Dr. Bhatt characterized as “last observation carried forward.”
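The "last observation carried forward" approach Dr. Bhatt described can be illustrated with a minimal sketch; the values below are hypothetical, not trial data, and the trial's actual statistical code is not public.

```python
# Last observation carried forward (LOCF): once a patient's data stop
# (e.g., a control who crosses over to RDN), the last observed value
# is reused for all later time points.
def locf(series):
    """Fill None gaps with the most recent observed value."""
    filled, last = [], None
    for x in series:
        if x is not None:
            last = x
        filled.append(last)
    return filled

# A hypothetical control patient's systolic BP, missing after
# unblinding and crossover at month 6:
bp = [180, 165, None, None]   # baseline, 6, 12, 24 months
print(locf(bp))               # [180, 165, 165, 165]
```

The sketch makes the criticism concrete: a crossover's "control" BP is frozen at its last pre-crossover value, which is why the imputation drew caution from the panel.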
That’s another reason to be circumspect about the current results, observed Naomi Fisher, MD, also of Brigham and Women’s and Harvard Medical School, as a panelist following Dr. Bhatt’s formal presentation.
“With all the missing data and imputational calculations,” she said, “I think we have to apply caution in the interpretation.”
She also pointed out that blinding in the trial was lifted at 6 months, allowing patients to learn their treatment assignment, and potentially influencing subsequent changes to medications.
They were prescribed, on average, about five antihypertensive meds, Dr. Fisher noted, and “that’s already a red flag. Patients taking that many medications generally aren’t universally taking them. There’s very high likelihood that there could have been variable adherence.”
Patients who learned they were in the sham control group, for example, could have “fallen off” taking their medications, potentially worsening outcomes and amplifying the apparent benefit of RDN. Such an effect, Dr. Fisher said, “could have contributed” to the study’s long-term results.
As previously reported, the single-blind SYMPLICITY HTN-3 had randomly assigned 535 patients to either RDN or a sham control procedure, 364 and 171 patients respectively, at 88 U.S. centers. The trial used the Symplicity Flex RDN radiofrequency ablation catheter (Medtronic).
For study entry, patients were required to have office systolic BP of at least 160 mm Hg and 24-hour ambulatory systolic BP of at least 135 mm Hg despite stable, maximally tolerated dosages of a diuretic plus at least two other antihypertensive agents.
Blinding was lifted at 6 months, per protocol, after which patients in the sham control group who still met the trial’s BP entry criteria were allowed to cross over and undergo RDN. The 101 controls who crossed over were combined with the original active-therapy cohort for the current analysis.
From baseline to 36 months, the mean number of medication classes per patient remained between 4.5 and 5, with no significant difference between groups at any point.
However, medication burden, expressed as number of doses daily, held steady between 9.7 and 10.2 for controls, while the RDN group showed a steady decline from 10.2 to 8.4. Differences between RDN patients and controls were significant at both 24 months (P = .01) and 36 months (P = .005), Dr. Bhatt reported.
All relative BP decreases favored the RDN group (P < .0001).
The RDN group spent a greater proportion of time with systolic BP at goal compared with the sham control group in an analysis that did not involve imputation of data, Dr. Bhatt reported. The proportions of time in therapeutic range were 18% for RDN patients and 9% for controls (P < .0001).
As in the 6- and 12-month analyses, there was no adverse safety signal associated with RDN in follow-up out to both 36 and 48 months. As Dr. Bhatt reported, the rates of the composite safety endpoint in RDN patients, crossovers, and noncrossover controls were 15%, 14%, and 14%, respectively.
The safety endpoint included death, new end-stage renal disease, significant embolic events causing end-organ damage, vascular complications, renal-artery reintervention, and “hypertensive emergency unrelated to nonadherence to medications,” Dr. Bhatt reported.
There are many patients with “out of control” HTN “who cannot remain compliant on their medications,” Dr. Mehran observed for this news organization. “I believe having an adjunct to medical management of these patients,” that is RDN, “is going to be tremendously important.”
SYMPLICITY HTN-3 was funded by Medtronic. Dr. Bhatt has disclosed ties with many companies, as well as WebMD, Medscape Cardiology, and other publications or organizations. Dr. Mehran disclosed ties to Abbott Vascular, AstraZeneca, Bayer, Bristol-Myers Squibb, CSL Behring, Daiichi-Sankyo/Eli Lilly, Medtronic, Novartis, OrbusNeich, Abiomed; Boston Scientific, Alleviant, Amgen, AM-Pharma, Applied Therapeutics, Arena, BAIM, Biosensors, Biotronik, CardiaWave, CellAegis, Concept Medical, CeloNova, CERC, Chiesi, Cytosorbents, Duke University, Element Science, Faraday, Humacyte, Idorsia, Insel Gruppe, Philips, RenalPro, Vivasure, and Zoll; as well as Medscape/WebMD, and Cine-Med Research; and holding equity, stock, or stock options with Control Rad, Applied Therapeutics, and Elixir Medical. Dr. Fisher disclosed ties to Medtronic, Recor Medical, and Aktiia; and receiving grants or hold research contracts with Recor Medical and Aktiia.
A version of this article first appeared on Medscape.com.
AT TCT 2022
Minorities hit especially hard by overdose deaths during COVID
The results underscore the “urgency of expanding prevention, treatment, and harm reduction interventions tailored to specific populations, especially American Indian or Alaska Native and Black populations, given long-standing structural racism and inequities in accessing these services,” the researchers note.
The study was published online in JAMA Network Open.
‘Urgent need’ for education
From February 2020 to August 2021, drug overdose deaths in the United States rose 37%, and these deaths were largely due to synthetic opioids other than methadone – primarily fentanyl or analogs – and methamphetamine.
Yet, data are lacking regarding racial and ethnic disparities in overdose death rates.
To investigate, Beth Han, MD, PhD, with the National Institute on Drug Abuse, and colleagues analyzed federal drug overdose death data for individuals aged 15-34 and 35-64 from March 2018 to August 2021.
Among individuals aged 15-34, from March 2018 to August 2021, overdose death rates involving any drug, fentanyl, and methamphetamine with or without fentanyl increased overall.
For the 6 months from March to August 2021, non-Hispanic Native American or Alaska Native men had the highest rates overall involving any drug, fentanyl, and methamphetamine without fentanyl, with rates of 42.0, 30.2, and 6.0 per 100,000, respectively.
The highest rates (per 100,000) of drug overdose deaths involving methamphetamine with fentanyl were for Native American or Alaska Native men (9.2) and women (8.0) and non-Hispanic White men (6.7).
Among people aged 35-64, from March to August 2021, overall drug overdose rates (per 100,000) were highest among non-Hispanic Black men (61.2) and Native American or Alaska Native men (60.0), and fentanyl-involved death rates were highest among Black men (43.3).
Rates involving methamphetamine with fentanyl were highest among Native American or Alaska Native men (12.6) and women (9.4) and White men (9.5).
Rates involving methamphetamine without fentanyl were highest among Native American or Alaska Native men (22.9).
The researchers note the findings highlight the “urgent need” for education on the dangers of methamphetamine and fentanyl.
Expanding access to naloxone, fentanyl test strips, and treatments for substance use disorders to disproportionately affected populations is also critical to help curb disparities in drug overdose deaths, they add.
Limitations of the study are that overdose deaths may be underestimated because of the use of 2021 provisional data and that racial or ethnic identification may be misclassified, especially for Native American or Alaska Native people.
This study was sponsored by the National Institute on Drug Abuse of the National Institutes of Health and the Centers for Disease Control and Prevention. The authors report no relevant disclosures.
A version of this article first appeared on Medscape.com.
FROM JAMA NETWORK OPEN
Commentary: Preventing and Predicting T2D Complications, October 2022
Diabetes guidelines recommend sodium-glucose transport protein 2 (SGLT2) inhibitors to reduce kidney disease progression in patients with type 2 diabetes (T2D) and moderate-to-severe albuminuric kidney disease on the basis of renal outcomes trials, such as CREDENCE and DAPA-CKD. However, these trials did not include patients who are at low risk for kidney disease progression.
Mosenzon and colleagues published a post hoc analysis of the DECLARE-TIMI 58 trial focused on patients with low kidney risk. They demonstrated that dapagliflozin slowed the progression of kidney disease in patients with T2D at high cardiovascular risk, including those at low risk for kidney progression. The absolute benefit for slowing kidney progression was much lower in patients with low kidney risk than in those at high or very high risk (number needed to treat 177 vs 13-23). Though dapagliflozin does provide kidney protection across a spectrum of kidney risk, clinicians should consider the level of risk when starting an SGLT2 inhibitor for slowing kidney disease.
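The number-needed-to-treat contrast above follows directly from the absolute risk reduction (NNT = 1/ARR). The event risks below are illustrative values chosen only to reproduce the reported NNTs; they are not taken from the trial.

```python
def nnt(control_risk, treated_risk):
    """Number needed to treat = 1 / absolute risk reduction."""
    return round(1 / (control_risk - treated_risk))

# Hypothetical risks picked to match the reported NNTs:
print(nnt(0.0300, 0.02435))  # 177  (low kidney risk: tiny ARR)
print(nnt(0.2000, 0.1231))   # 13   (very high kidney risk: large ARR)
```

The arithmetic shows why the same relative benefit translates into very different clinical value across risk strata: the lower the baseline risk, the smaller the absolute risk reduction, and the more patients must be treated to prevent one event.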
SGLT2 inhibitor outcome trials and meta-analyses have mainly shown neutral results for ischemic stroke, except for sotagliflozin vs placebo in the SCORED trial, in which sotagliflozin reduced total stroke. Recently, in a retrospective longitudinal cohort study of patients with T2D in Taiwan, Lin and colleagues showed a significant reduction in new-onset stroke among those who used an SGLT2 inhibitor compared with those who did not. A 15% relative risk reduction in stroke was shown in an analysis adjusted for age, sex, and duration of T2D, with a similar reduction in a propensity score-matched analysis. Although limited by its observational design, this study suggests that the impact of SGLT2 inhibitors on stroke outcomes warrants further research.
Severe hypoglycemia is a serious complication of insulin and insulin secretagogue therapy. There have been few studies of the association between long-term glycemic variability, in A1c and fasting plasma glucose (FPG), and the risk for severe hypoglycemia. Long and colleagues performed a post hoc analysis of the ACCORD study and found that both A1c and FPG variability were associated with a greater risk for severe hypoglycemia in T2D, with FPG variability being a more sensitive indicator than A1c variability. Clinicians need to be aware that A1c and FPG variability in insulin- or insulin secretagogue–treated patients with T2D places them at greater risk for severe hypoglycemia, and such variability should be considered a potential target of treatment.
Although a higher mean A1c has been linked to diabetes microvascular and macrovascular complications, there is a paucity of data comparing mean A1c and A1c variability and diabetes complications. In a prospective study from Taiwan, Wu and colleagues demonstrated that both mean A1c and A1c variability predicted most diabetes-related complications, with mean A1c being more effective at predicting retinopathy and A1c variability being more effective at predicting a decline in kidney function and cardiovascular and total mortality. Perhaps physicians need to pay more attention to A1c variability and not just the mean A1c over time when assessing an individual and their overall risk for diabetes complications.
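One common way to quantify the visit-to-visit variability discussed in these studies is the coefficient of variation (SD as a percentage of the mean); the studies' exact variability metrics may differ, so this is a generic sketch with made-up A1c values.

```python
from statistics import mean, stdev

def a1c_cv(values):
    """Coefficient of variation (%): SD / mean * 100, one common
    visit-to-visit glycemic variability metric."""
    return 100 * stdev(values) / mean(values)

stable   = [7.0, 7.1, 6.9, 7.0]  # same mean A1c (~7%), low variability
variable = [6.0, 8.2, 6.5, 7.8]  # similar mean, high variability
print(a1c_cv(stable) < a1c_cv(variable))  # True
```

The point of the metric: two patients can share the same mean A1c yet carry very different complication risk if one swings widely between visits.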
‘Concerning’ rate of benzo/Z-drug use in IBD
Patients with inflammatory bowel disease (IBD) are 70% more likely to use benzodiazepines and “Z-drugs” than are the general population, a large study from Canada suggests.
Mood/anxiety disorders and sleep disorders are common in patients with IBD, but few studies have looked at use of benzodiazepines and Z-drugs (such as zolpidem, zaleplon, and eszopiclone) in this patient population.
The results are “concerning, and especially as the IBD population ages, these drugs are associated with health risks, including something as simple as falls,” first author Charles Bernstein, MD, of the IBD Clinical and Research Centre, University of Manitoba, Winnipeg, told this news organization.
“Clinicians need to find better strategies to deal with anxiety disorders and sleep disorders in the IBD population,” Dr. Bernstein said.
The study was published online in the American Journal of Gastroenterology.
High burden of use
Using administrative data from Manitoba, Dr. Bernstein and colleagues identified 5,741 patients with IBD (2,381 with Crohn’s disease and 3,360 with ulcerative colitis) and matched them (1:5) to 28,661 population controls without IBD.
Over a 20-year period (1997-2017), there was a “high burden” of benzodiazepine and Z-drug use in the IBD population. In 2017, roughly 20% of Manitobans with IBD were using benzodiazepines, and 20% were using Z-drugs, the study team reports.
The benzodiazepine use rate (per 1,000) was 28.06 in the IBD cohort vs. 16.83 in the non-IBD population (adjusted rate ratio, 1.67). The use rate for Z-drugs was 21.07 in the IBD cohort vs. 11.26 in the non-IBD population (adjusted RR, 1.87).
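The crude version of the rate ratios reported here is simply the quotient of the two use rates; the study's published ratios were additionally adjusted for covariates, so the crude figures below match only because the rates happen to divide to the same values.

```python
def rate_ratio(rate_exposed, rate_unexposed):
    """Crude rate ratio: exposed rate / unexposed rate.
    (The study reported covariate-adjusted ratios.)"""
    return rate_exposed / rate_unexposed

# Rates per 1,000 from the article:
print(round(rate_ratio(28.06, 16.83), 2))  # 1.67 (benzodiazepines)
print(round(rate_ratio(21.07, 11.26), 2))  # 1.87 (Z-drugs)
```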
Benzodiazepine use declined from 1997 to 2016, but it remained at least 50% higher in patients with IBD than in the general population over this period, the researchers found. The rate of Z-drug use also was higher in the IBD population than in the general population but remained stable over the 20-year study period.
Regardless of age, men and women had similarly high use rates for benzodiazepines, Z-drugs, and joint use of benzodiazepines and Z-drugs. The highest incidence rates for joint benzodiazepine and Z-drug use were in young adults (age 18-44 years), and these rates were similar among men and women.
Patients with IBD and a mood/anxiety disorder also were more likely to use benzodiazepines and Z-drugs and to be continuous users than were those without a mood/anxiety disorder.
Mental health and IBD go hand in hand
“The results are not very surprising, but they highlight the importance of mental health and mood disturbances in patients with IBD,” Ashwin Ananthakrishnan, MBBS, MPH, with Massachusetts General Hospital and Harvard Medical School in Boston, who wasn’t involved in the study, told this news organization.
“It is important for treating physicians to be aware of the important mental health implications of IBD diagnosis and disease activity, to screen patients for these disturbances, and to institute early effective interventions,” Dr. Ananthakrishnan said.
Also offering perspective, Laurie Keefer, PhD, academic health psychologist and director of psychobehavioral research within the division of gastroenterology, Mount Sinai Health System, New York, said that she is “concerned but not surprised” by the results of this study.
“One in three patients with IBD meets criteria for an anxiety disorder,” Dr. Keefer told this news organization.
And with the ongoing mental health crisis and shortage of mental health providers, “gastroenterologists are, unfortunately, in the position where they may have to manage these issues,” she said.
Dr. Keefer noted that when patients are first diagnosed with IBD, they will likely be on prednisone, and “an antidote” for the side effects of prednisone are benzodiazepines and sleeping medications because prednisone itself causes insomnia. “However, that’s really just a band-aid,” she said.
A major concern, said Dr. Keefer, is that young men and women who are diagnosed with IBD in their 20s may start using these drugs and become reliant on them.
“People do build up a tolerance to these drugs, so they need more and more to receive the same effect,” she said.
A better approach is to figure out why patients are so anxious and teach them skills to manage their anxiety and sleep problems so that they’re not dependent on these drugs, Dr. Keefer said.
“There are behavioral strategies that can help. These are harder to do, and they’re not a quick fix. However, they are skills you can learn in your 20s and so when you have an IBD flare at 50, you have the skills to deal with it,” she said.
“We just have to be a little more proactive in really encouraging patients to learn disease management skills,” Dr. Keefer added.
The study was funded by the Canadian Institutes of Health Research and Crohn’s and Colitis Canada. Dr. Bernstein has disclosed relationships with AbbVie Canada, Amgen Canada, Bristol-Myers Squibb Canada, Roche Canada, Janssen Canada, Sandoz Canada, Takeda and Takeda Canada, Pfizer Canada, Mylan Pharmaceuticals, and Medtronic Canada. Dr. Ananthakrishnan and Dr. Keefer report no relevant financial relationships.
A version of this article first appeared on Medscape.com.
A version of this article first appeared on Medscape.com.
Patients with inflammatory bowel disease (IBD) are 70% more likely than the general population to use benzodiazepines and “Z-drugs,” a large study from Canada suggests.
Mood/anxiety disorders and sleep disorders are common in patients with IBD, but few studies have looked at use of benzodiazepines and Z-drugs (such as zolpidem, zaleplon, and eszopiclone) in this patient population.
The results are “concerning, and especially as the IBD population ages, these drugs are associated with health risks, including something as simple as falls,” first author Charles Bernstein, MD, of the IBD Clinical and Research Centre, University of Manitoba, Winnipeg, told this news organization.
“Clinicians need to find better strategies to deal with anxiety disorders and sleep disorders in the IBD population,” Dr. Bernstein said.
The study was published online in the American Journal of Gastroenterology.
High burden of use
Using administrative data from Manitoba, Dr. Bernstein and colleagues identified 5,741 patients with IBD (2,381 with Crohn’s disease and 3,360 with ulcerative colitis) and matched them (1:5) to 28,661 population controls without IBD.
Over a 20-year period (1997-2017), there was a “high burden” of benzodiazepine and Z-drug use in the IBD population. In 2017, roughly 20% of Manitobans with IBD were using benzodiazepines, and 20% were using Z-drugs, the study team reports.
The benzodiazepine use rate (per 1,000) was 28.06 in the IBD cohort vs. 16.83 in the non-IBD population (adjusted rate ratio, 1.67). The use rate for Z-drugs was 21.07 in the IBD cohort vs. 11.26 in the non-IBD population (adjusted RR, 1.87).
Benzodiazepine use declined from 1997 to 2016, but it remained at least 50% higher in patients with IBD than in the general population over this period, the researchers found. The rate of Z-drug use also was higher in the IBD population than in the general population but remained stable over the 20-year study period.
Regardless of age, men and women had similarly high use rates for benzodiazepines, Z-drugs, and joint use of benzodiazepines and Z-drugs. The highest incidence rates for joint benzodiazepine and Z-drug use were in young adults (age 18-44 years), and these rates were similar among men and women.
Patients with IBD and a mood/anxiety disorder also were more likely to use benzodiazepines and Z-drugs and to be continuous users than were those without a mood/anxiety disorder.
Mental health and IBD go hand in hand
“The results are not very surprising, but they highlight the importance of mental health and mood disturbances in patients with IBD,” Ashwin Ananthakrishnan, MBBS, MPH, with Massachusetts General Hospital and Harvard Medical School in Boston, who wasn’t involved in the study, told this news organization.
“It is important for treating physicians to be aware of the important mental health implications of IBD diagnosis and disease activity, to screen patients for these disturbances, and to institute early effective interventions,” Dr. Ananthakrishnan said.
Also offering perspective, Laurie Keefer, PhD, academic health psychologist and director of psychobehavioral research within the division of gastroenterology, Mount Sinai Health System, New York, said that she is “concerned but not surprised” by the results of this study.
“One in three patients with IBD meets criteria for an anxiety disorder,” Dr. Keefer told this news organization.
And with the ongoing mental health crisis and shortage of mental health providers, “gastroenterologists are, unfortunately, in the position where they may have to manage these issues,” she said.
Dr. Keefer noted that when patients are first diagnosed with IBD, they are likely to be on prednisone, and because prednisone itself causes insomnia, benzodiazepines and sleeping medications are often given as “an antidote” for its side effects. “However, that’s really just a band-aid,” she said.
A major concern, said Dr. Keefer, is that young men and women who are diagnosed with IBD in their 20s may start using these drugs and become reliant on them.
“People do build up a tolerance to these drugs, so they need more and more to receive the same effect,” she said.
A better approach is to figure out why patients are so anxious and teach them skills to manage their anxiety and sleep problems so that they’re not dependent on these drugs, Dr. Keefer said.
“There are behavioral strategies that can help. These are harder to do, and they’re not a quick fix. However, they are skills you can learn in your 20s and so when you have an IBD flare at 50, you have the skills to deal with it,” she said.
“We just have to be a little more proactive in really encouraging patients to learn disease management skills,” Dr. Keefer added.
The study was funded by the Canadian Institutes of Health Research and Crohn’s and Colitis Canada. Dr. Bernstein has disclosed relationships with AbbVie Canada, Amgen Canada, Bristol-Myers Squibb Canada, Roche Canada, Janssen Canada, Sandoz Canada, Takeda and Takeda Canada, Pfizer Canada, Mylan Pharmaceuticals, and Medtronic Canada. Dr. Ananthakrishnan and Dr. Keefer report no relevant financial relationships.
A version of this article first appeared on Medscape.com.
FROM AMERICAN JOURNAL OF GASTROENTEROLOGY