What’s the best approach for dysplasia surveillance in IBD?
Chromoendoscopy
Chromoendoscopy is superior in both the detection and long-term management of dysplasia in IBD when compared to high-definition white-light examination. Chromoendoscopy not only enhances dysplasia detection but further improves the definition of these lesions which then facilitates endoscopic management.
Human beings have an innate visual perception limitation due to our inability to perceive depth in the red/green wavelength of light compared to the blue wavelength. All of the improvements in scope magnification and resolution bump up against this fact of our biology. Blue dye enhances our ability to perceive depth in this milieu and therefore detect and define flat lesions.
The superiority of chromoendoscopy when using standard-definition colonoscopes has been demonstrated repeatedly and set the stage for the 2015 SCENIC international consensus statement and a seismic shift in our endoscopic management of dysplasia in patients with colitis. This evidence base remains relevant because only 77% of colonoscopies in the United States are performed with high-definition equipment. Nearly one-quarter of our patients lack access to the newer equipment and, without chromoendoscopy, are therefore being surveilled outside of current guidelines.
Since the SCENIC statement, multiple studies comparing chromoendoscopy with newer, higher-resolution colonoscopes have been performed. The vast preponderance of evidence has shown either a trend toward superiority or the outright superiority of chromoendoscopy compared with high-definition white-light examination in the detection and long-term management of dysplasia.
Chromoendoscopy has allowed us to increase our visual vocabulary in describing dysplasia in the setting of colitis and, thus, open the door to further innovation and perhaps adoption of artificial intelligence going forward. Our ability to classify lesions encountered in colitis mucosa has become more precise with the expanded terminology the dye-enhanced high-definition view affords, with the Frankfurt Advanced Chromoendoscopic IBD Lesion Classification being the best and most detailed example.
It is no accident that advanced endoscopists have universally adopted chromoendoscopy for the management of dysplastic lesions whether by mucosal resection or submucosal dissection techniques. Chromoendoscopy is recommended by all society guidelines because of these inherent advantages.
Is high-definition white-light examination “good enough” for surveilling our patients with colitis? The overall incidence of colorectal cancer (CRC) in IBD has been declining, which makes each colonoscopy count more. We are performing up to 88 colonoscopies in patients with colitis to find a single cancer (compared with 8 in non-IBD surveillance patients). We need to be performing fewer and more precise chromoendoscopic examinations. Otherwise, we are failing to serve our IBD patients by performing too many negative procedures at too high a cost. Our patients deserve more than merely “good enough.”
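To make the arithmetic behind those figures explicit (a back-of-the-envelope sketch that assumes the quoted numbers represent colonoscopies performed per cancer detected; it is not additional data from the talk), the “number needed to scope” (NNS) is simply the reciprocal of the per-procedure cancer yield:

\[
\mathrm{NNS} = \frac{1}{\text{cancers detected per colonoscopy}}
\]

\[
\mathrm{NNS} = 88 \;\Rightarrow\; \text{yield} \approx \tfrac{1}{88} \approx 1.1\%, \qquad \mathrm{NNS} = 8 \;\Rightarrow\; \text{yield} = \tfrac{1}{8} = 12.5\%
\]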
James F. Marion, MD, is professor of medicine at the Icahn School of Medicine at Mount Sinai and director of education and outreach at The Susan and Leonard Feinstein Inflammatory Bowel Disease Center of The Mount Sinai Hospital, both in New York. He is on the advisory board for Janssen.
High-definition white light endoscopy
Longstanding ulcerative colitis and Crohn’s colitis increase the risk of developing colorectal cancer. The majority of neoplastic lesions are visible endoscopically, and, therefore, dye-spray chromoendoscopy (DCE) may not be necessary for all inflammatory bowel disease (IBD) patients undergoing routine dysplasia surveillance colonoscopy.
High-definition white light (HDWL) endoscopes have higher magnification capacity and pixel density than standard-definition (SD) systems and provide sharper images with fewer artifacts. Although DCE has been proven superior to SD examination, no differences in dysplasia detection have been shown for routine surveillance with HDWL compared with DCE.
The SCENIC guidelines’ key recommendation for optimizing detection and management of dysplasia in IBD is to use an HD colonoscope. The 2019 ACG Practice Guidelines for Dysplasia Screening and Surveillance likewise recommend HD colonoscopes.
In a network meta-analysis of eight parallel-group randomized controlled trials (RCTs), there was very low-quality evidence to support the use of DCE over HDWL. This was contrary to prior, non-RCT studies, which suggested that both SD and HDWL were inferior to DCE. More recently, Iacucci and colleagues conducted a randomized noninferiority trial of detection rates of neoplastic lesions in IBD patients with longstanding, inactive colitis who were assigned to HDWL, DCE, or virtual chromoendoscopy (VCE) groups. The conclusion was that VCE and HDWL were not inferior to DCE, and HDWL was sufficient for detection of all neoplastic lesions, including dysplasia and adenocarcinoma. In another large, multicenter, prospective RCT at nine tertiary hospitals in South Korea, detection rates of colitis-associated dysplasia and of all colorectal neoplasia were comparable between HDWL and high-definition chromoendoscopy. Lastly, a meta-analysis of six RCTs concluded that, although DCE is superior to SD in identification of dysplasia, there was no benefit of DCE compared with HDWL.
In summary, HDWL colonoscopy should be the standard of care for routine dysplasia surveillance in IBD. DCE should be considered in patients found to have a dysplastic lesion on HDWL, in order to better delineate the lesion margins, facilitate endoscopic resection, and guide future dysplasia surveillance colonoscopies in the higher-risk IBD patient. Overall, a close and careful examination of the entire colon with HDWL is sufficient for dysplasia detection and routine surveillance in IBD patients.
Anita Afzali, MD, MPH, AGAF, is medical director of the Inflammatory Bowel Disease Center and program director of the Advanced Inflammatory Bowel Disease Fellowship at Ohio State University in Hilliard, Ohio. She has no relevant conflicts of interest.
These remarks were made during one of the AGA Postgraduate Course sessions held at DDW 2021.
Retinopathy risk higher in young-onset T2D, more so in men
Men diagnosed with type 2 diabetes (T2D) by the age of 40 years appear significantly more likely to develop retinopathy than men who are diagnosed at an older age, Norwegian researchers report.
In a cross-sectional study of about 10,000 people, men with young-onset T2D were 72% more likely than men aged 50 years or older to have retinopathy.
While an increased retinopathy risk was initially also seen in women with young-onset T2D versus older women, this difference was not significant after adjustment for various confounding factors.
The effect of young-onset diabetes on retinopathy seems to be gender specific, Katrina Tibballs, MD, of the department of general practice at the University of Oslo, reported at the annual meeting of the European Association for the Study of Diabetes.
“In the unadjusted analysis, the odds ratio for retinopathy was substantially higher in both [young-onset] men [odds ratio, 3.0] and women [OR, 2.46], compared with those 50 or older at diabetes diagnosis,” Dr. Tibballs said.
That relationship was not substantially altered after adjustment for variables such as level of education, country background, gender, and body mass index, with adjusted ORs of 2.56 and 2.55 for men and women, respectively.
However, after further adjustment for current age, diabetes duration, blood lipids, and glycated hemoglobin levels, the difference no longer held for women (OR, 1.34; 95% confidence interval, 0.95-1.89), although it did for men (OR, 1.72; 95% CI, 1.29-2.29).
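As a brief reading aid for these figures (a general statistical note, not additional study data), the odds ratio compares the odds of retinopathy in the young-onset group with the odds in the reference group (those 50 years or older at diagnosis), and a 95% confidence interval that excludes 1 is conventionally taken as statistically significant:

\[
\mathrm{OR} = \frac{p_{\text{young}}/(1-p_{\text{young}})}{p_{\text{ref}}/(1-p_{\text{ref}})}
\]

\[
\mathrm{OR} = 1.72 \;\Rightarrow\; 72\% \text{ higher odds}; \quad 95\%\ \mathrm{CI}\ 1.29\text{-}2.29 \text{ excludes } 1 \;\Rightarrow\; \text{significant}; \quad 0.95\text{-}1.89 \text{ includes } 1 \;\Rightarrow\; \text{not significant}
\]

Here \(p_{\text{young}}\) and \(p_{\text{ref}}\) denote the retinopathy proportions in the young-onset and reference groups, respectively.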
First data in Norwegian population
Cross-sectional data on more than 10,000 people with T2D were used for the analysis. These came from the ROSA4 study, a general practice study conducted across Norway in 2014.
Just over 10% of the study population used in the analysis was under the age of 40 years at diagnosis of T2D; 21% were aged between 40 and 49 years, and 69% were at least 50 years old.
The mean age of those with young-onset T2D, defined as a diagnosis before the age of 40 years, was 33 years. These individuals had a longer disease duration than those in the other age groups (11.4 vs. 10.0 vs. 7.8 years).
“Looking at clinical characteristics, we see that individuals [with young-onset T2D] have a higher level of hemoglobin A1c than those with diabetes onset later in life,” Dr. Tibballs said.
“This is despite a substantially higher proportion [being] treated with insulin and fewer on lifestyle interventions alone.”
Gender differences were seen in A1c levels, with men with young-onset T2D having consistently higher levels than women, with levels increasing with diabetes duration.
Rise in retinopathy faster in men than in women
Dr. Tibballs reported that not only did the prevalence of retinopathy rise faster in those diagnosed at a younger age, but it also rose more quickly in men with young-onset T2D than it did in their female counterparts.
“Comparing that [young-onset diabetes] and later-onset diabetes in men and women separately, we see a clearly higher prevalence of retinopathy with increasing diabetes duration for [young-onset] men,” she said.
In women, on the other hand, there was “no clear indication of a higher retinopathy prevalence in [young-onset diabetes], except in those with the longest diabetes duration.”
So, what do the results mean for practice? First, they confirm prior work showing a strong association between retinopathy and age at diagnosis of T2D. Second, they suggest that this association persists despite intensive glucose-lowering treatment.
Men with young-onset T2D may have had a delayed diagnosis compared with women and with individuals with later-onset diabetes, Dr. Tibballs speculated.
“This may in turn lead to delayed onset of glucose-lowering treatment, allowing for more time with high glycemic exposure and increased risk of acquiring complications, such as retinopathy at the time of diagnosis, or in the first years after,” said Dr. Tibballs.
These are cross-sectional data, “so we can’t say anything about whether this treatment is sufficient, but it is obviously not reducing HbA1c levels as much as we would like,” added Dr. Tibballs, who is a primary care physician and PhD student.
The study was supported by The Norwegian Research Fund for General Practice. Dr. Tibballs had no conflicts of interest to disclose.
FROM EASD 2021
Genetic testing for colon cancer: Who, when, and how
Gastroenterologists should be skilled in recognizing patients with an inherited risk of colorectal neoplasia. Fay Kastrinos, MD, presented the case of a 49-year-old woman who had more than 10 cumulative adenomas and a cecal adenocarcinoma on two colonoscopies, the first of which was performed for evaluation of rectal bleeding. Carol Burke, MD, reviewed the differential diagnosis of adenomatous polyposis (defined as >10 cumulative adenomas).
Germline syndromes include familial adenomatous polyposis (FAP), MUTYH-associated polyposis (MAP), and a number of rare germline syndromes. Lynch syndrome should be considered, especially for carriers of pathogenic variants in MSH6, who can present with a polyposis phenotype, as well as in children with constitutional mismatch repair deficiency syndrome. Finally, polyposis can be due to smoking, familial clustering, or previous abdominal radiation (termed therapy-associated polyposis). Polyposis without a known cause is referred to as colonic polyposis of unknown etiology (CPUE).
Dr. Kastrinos reviewed the patient’s three-generation family history of a brother and mother with “polyps” and second-degree relatives with endometrial and colon cancer. Niloy Jewel Samadder, MD, presented on the role of taking a comprehensive family history, tumor tests for Lynch syndrome, selection of genetic test type, and risks, benefits, and alternatives of genetic testing. Dr. Samadder reviewed indications for germline genetic testing for colorectal neoplasia of which the patient met two criteria, namely colorectal cancer under age 50 and 10 or more cumulative adenomas.
The final section was presented by this author on multigene panel testing, in which multiple genes are sequenced simultaneously. This patient’s panel showed two pathogenic variants in the MUTYH gene, consistent with MAP, a recessive polyposis syndrome typically presenting with tens to hundreds of cumulative adenomas. The test also showed a variant of uncertain significance (VUS), which is not clinically actionable. Providers counseling patients on multigene panel testing should discuss the possibility of VUS results (especially in individuals of non-European descent), moderately penetrant genes for which management recommendations are uncertain, and unexpected findings in genes not associated with colonic neoplasia. Studies have shown that the prevalence of finding an inherited syndrome is increased at younger ages of disease onset.
Dr. Kastrinos summarized key points from the session: hereditary colorectal cancer syndromes are not rare; red flags for inherited syndromes include early-onset colorectal neoplasia and/or numerous relatives with colorectal and other extracolonic cancers; extended family history assessment is recommended; and genetic risk assessment and genetic testing with multigene panels is a process that should be personalized. The question-and-answer session was lively, with discussion of cost as well as direct-to-consumer genetic testing.
Sonia Kupfer, MD, AGAF, is an associate professor in the section of gastroenterology, hepatology, and nutrition at the University of Chicago. She has no financial conflicts of interest. These remarks were made during one of the AGA Postgraduate Course sessions held at DDW 2021.
Extraesophageal symptoms of GERD
Patients often present with symptoms that are not classic for reflux, such as chronic cough, worsening asthma, sore throat, or globus.
In the upper GI section of the postgraduate course program, Rena Yadlapati, MD, and C. Prakash Gyawali, MD, MRCP, educated us about optimal strategies for the diagnosis and treatment of this difficult group of patients. Dr. Gyawali reminded us to stratify patients into those with a high or low likelihood of reflux as a contributing etiology for suspected extraesophageal reflux. Dr. Yadlapati reviewed the utility of the HASBEER score in stratifying patients into these two risk categories. Patients with known reflux at baseline and/or classic reflux symptoms in addition to extraesophageal symptoms are more likely to have abnormal esophageal acid exposure than those without classic heartburn and/or regurgitation. The low-risk group may then benefit from diagnostic testing off PPI therapy (either impedance/pH monitoring or wireless pH testing), whereas those in the high-risk group may undergo impedance/pH testing on PPI therapy to ensure control of reflux while on therapy.
Dr. Yadlapati also updated the audience on the lack of robust data supporting the clinical utility of oropharyngeal pH testing or salivary pepsin assays. It was generally agreed that the majority of patients who do not respond to aggressive acid-suppressive therapy likely do not have reflux-related extraesophageal symptoms and that alternative etiologies may be at play.
Finally, both investigators outlined the importance of neuromodulation in those whose symptoms may be due to “irritable larynx.” They emphasized the role of tricyclics as well as gabapentin, used off label, for patients who have normal reflux testing and continue to have chronic cough or globus sensation.
Michael F. Vaezi, MD, PhD, MSc, is an associate chief and a clinical director of the division of gastroenterology, hepatology, and nutrition and director of the Clinical Research and Center for Esophageal Disorders at Vanderbilt University, Nashville, Tenn. He reports consulting for Phathom, Ironwood, Diversatek, Isothrive, and Medtronic. These remarks were made during one of the AGA Postgraduate Course sessions held at DDW 2021.
Worried parents scramble to vaccinate kids despite FDA guidance
One week after reporting promising results from the trial of their COVID-19 vaccine in children ages 5-11, Pfizer and BioNTech announced they’d submitted the data to the Food and Drug Administration. But that hasn’t stopped some parents from discreetly getting their children under age 12 vaccinated.
“The FDA, you never want to get ahead of their judgment,” Anthony S. Fauci, MD, director of the National Institute of Allergy and Infectious Diseases, told MSNBC on Sept. 28. “But I would imagine in the next few weeks, they will examine that data and hopefully they’ll give the okay so that we can start vaccinating children, hopefully before the end of October.”
Lying to vaccinate now
More than half of all parents with children under 12 say they plan to get their kids vaccinated, according to a Gallup poll.
And although the FDA and the American Academy of Pediatrics have warned against it, some parents whose children can pass for 12 have lied to get them vaccinated already.
Dawn G. is a mom of two in southwest Missouri, where less than 45% of the population has been fully vaccinated. Her son turns 12 in early October, but in-person school started in mid-August.
“It was scary, thinking of him going to school for even 2 months,” she said. “Some parents thought their kid had a low chance of getting COVID, and their kid died. Nobody expects it to be them.”
In July, she and her husband took their son to a walk-in clinic and lied about his age.
“So many things can happen, from bullying to school shootings, and now this added pandemic risk,” she said. “I’ll do anything I can to protect my child, and a birthdate seems so arbitrary. He’ll be 12 in a matter of weeks. It seems ridiculous that that date would stop me from protecting him.”
In northern California, Carrie S. had a similar thought. When the vaccine was authorized for children ages 12-15 in May, the older of her two children got the shot right away. But her youngest doesn’t turn 12 until November.
“We were tempted to get the younger one vaccinated in May, but it didn’t seem like a rush. We were willing to wait to get the dosage right,” she said. “But as Delta came through, there were no options for online school, the CDC was dropping mask expectations – it seemed like the world was ready to forget the pandemic was happening. It seemed like the least-bad option to get her vaccinated so she could go back to school, and we could find some balance of risk in our lives.”
Adult vs. pediatric doses
For now, experts advise against getting younger children vaccinated, even those who are the size of an adult, because of the way the human immune system develops.
“It’s not really about size,” said Anne Liu, MD, an immunologist and pediatrics professor at Stanford (Calif.) University. “The immune system behaves differently at different ages. Younger kids tend to have a more exuberant innate immune system, which is the part of the immune system that senses danger, even before it has developed a memory response.”
The adult Pfizer-BioNTech vaccine contains 30 mcg of mRNA, while the pediatric dose is just 10 mcg. That smaller dose produces an immune response similar to what’s seen in adults who receive 30 mcg, according to Pfizer.
“We were one of the sites that was involved in the phase 1 trial, a lot of times that’s called a dose-finding trial,” said Michael Smith, MD, a coinvestigator for the COVID vaccine trials done at Duke University. “And basically, if younger kids got a higher dose, they had more of a reaction, so it hurt more. They had fever, they had more redness and swelling at the site of the injection, and they just felt lousy, more than at the lower doses.”
At this point, with Pfizer’s data showing that younger children need a smaller dose, it doesn’t make sense to lie about your child’s age, said Dr. Smith.
“If my two options were having my child get the infection versus getting the vaccine, I’d get the vaccine. But we’re a few weeks away from getting the lower dose approved in kids,” he said. “It’s certainly safer. I don’t expect major, lifelong side effects from the higher dose, but it’s going to hurt, your kid’s going to have a fever, they’re going to feel lousy for a couple days, and they just don’t need that much antigen.”
A version of this article first appeared on WebMD.com.
One week after reporting promising results from the trial of their COVID-19 vaccine in children ages 5-11, Pfizer and BioNTech announced they’d submitted the data to the Food and Drug Administration. But that hasn’t stopped some parents from discreetly getting their children under age 12 vaccinated.
“The FDA, you never want to get ahead of their judgment,” Anthony S. Fauci, MD, director of the National Institute of Allergy and Infectious Diseases, told MSNBC on Sept. 28. “But I would imagine in the next few weeks, they will examine that data and hopefully they’ll give the okay so that we can start vaccinating children, hopefully before the end of October.”
Lying to vaccinate now
More than half of all parents with children under 12 say they plan to get their kids vaccinated, according to a Gallup poll.
And although the FDA and the American Academy of Pediatrics have warned against it, some parents whose children can pass for 12 have lied to get them vaccinated already.
Dawn G. is a mom of two in southwest Missouri, where less than 45% of the population has been fully vaccinated. Her son turns 12 in early October, but in-person school started in mid-August.
“It was scary, thinking of him going to school for even 2 months,” she said. “Some parents thought their kid had a low chance of getting COVID, and their kid died. Nobody expects it to be them.”
In July, she and her husband took their son to a walk-in clinic and lied about his age.
“So many things can happen, from bullying to school shootings, and now this added pandemic risk,” she said. “I’ll do anything I can to protect my child, and a birthdate seems so arbitrary. He’ll be 12 in a matter of weeks. It seems ridiculous that that date would stop me from protecting him.”
In northern California, Carrie S. had a similar thought. When the vaccine was authorized for children ages 12-15 in May, the older of her two children got the shot right away. But her youngest doesn’t turn 12 until November.
“We were tempted to get the younger one vaccinated in May, but it didn’t seem like a rush. We were willing to wait to get the dosage right,” she ssaid. “But as Delta came through, there were no options for online school, the CDC was dropping mask expectations –it seemed like the world was ready to forget the pandemic was happening. It seemed like the least-bad option to get her vaccinated so she could go back to school, and we could find some balance of risk in our lives.”
Adult vs. pediatric doses
For now, experts advise against getting younger children vaccinated, even those who are the size of an adult, because of the way the human immune system develops.
“It’s not really about size,” said Anne Liu, MD, an immunologist and pediatrics professor at Stanford (Calif.) University. “The immune system behaves differently at different ages. Younger kids tend to have a more exuberant innate immune system, which is the part of the immune system that senses danger, even before it has developed a memory response.”
The adult Pfizer-BioNTech vaccine contains 30 mcg of mRNA, while the pediatric dose is just 10 mcg. That smaller dose produces an immune response similar to what’s seen in adults who receive 30 mcg, according to Pfizer.
“We were one of the sites that was involved in the phase 1 trial, a lot of times that’s called a dose-finding trial,” said Michael Smith, MD, a coinvestigator for the COVID vaccine trials done at Duke University. “And basically, if younger kids got a higher dose, they had more of a reaction, so it hurt more. They had fever, they had more redness and swelling at the site of the injection, and they just felt lousy, more than at the lower doses.”
At this point, with Pfizer’s data showing that younger children need a smaller dose, it doesn’t make sense to lie about your child’s age, said Dr. Smith.
“If my two options were having my child get the infection versus getting the vaccine, I’d get the vaccine. But we’re a few weeks away from getting the lower dose approved in kids,” he said. “It’s certainly safer. I don’t expect major, lifelong side effects from the higher dose, but it’s going to hurt, your kid’s going to have a fever, they’re going to feel lousy for a couple days, and they just don’t need that much antigen.”
A version of this article first appeared on WebMD.com.
Predicted pandemic retirement of many physicians hasn’t happened
The number of physicians who have chosen early retirement or have left medicine because of the COVID-19 pandemic may be considerably lower than previously thought, results of a new study suggest.
The research letter in the Journal of the American Medical Association, based on Medicare claims data, stated that “practice interruption rates were similar before and during the COVID-19 pandemic, except for a spike in April 2020.”
By contrast, in a Physicians Foundation Survey conducted in August 2020, 8% of physicians said they had closed their practices as a result of COVID, and 4% of the respondents said they planned to leave their practices within the next 12 months.
Similarly, a Jackson Physician Search survey in the fourth quarter of 2020 found that 54% of physicians surveyed had changed their employment plans. Of those doctors, 21% said they might hang up their white coat for early retirement. That works out to about 11% of the respondents.
The JAMA study’s authors analyzed the Medicare claims data from Jan. 1, 2019, to Dec. 30, 2020, to see how many physicians with Medicare patients had stopped filing claims for a period during those 2 years.
If a doctor had ceased submitting claims and then resumed filing them within 6 months after the last billing month, the lapse in filing was defined as “interruption with return.” If a physician stopped filing claims to Medicare and did not resume within 6 months, the gap in filing was called “interruption without return.”
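To make those definitions concrete, the sketch below applies the same rule to a physician’s monthly billing history. It is a simplified, hypothetical Python illustration, not the study authors’ code; the month indexing, variable names, and handling of the end of the observation period are assumptions.

def classify_interruptions(billing_months, last_observed_month, window=6):
    # billing_months: sorted month indexes (e.g., year * 12 + month) in which the
    # physician filed at least one Medicare claim.
    # Returns one label per lapse: "interruption with return" if billing resumed
    # within `window` months of the last billing month, otherwise "without return".
    events = []
    for prev, nxt in zip(billing_months, billing_months[1:]):
        if nxt - prev > 1:  # at least one month with no claims
            label = ("interruption with return" if nxt - prev <= window
                     else "interruption without return")
            events.append(label)
    # A trailing lapse counts as "without return" only if more than `window`
    # months of follow-up have elapsed with no further claims.
    if last_observed_month - billing_months[-1] > window:
        events.append("interruption without return")
    return events

# Example: monthly claims through March 2020, a lapse in April, and a return in May.
months = [2019 * 12 + m for m in range(1, 13)] + [2020 * 12 + m for m in (1, 2, 3, 5, 6)]
print(classify_interruptions(months, last_observed_month=2020 * 12 + 12))
# ['interruption with return']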
In April 2020, 6.9% of physicians billing Medicare had a practice interruption, compared to 1.4% in 2019. But only 1.1% of physicians stopped practice in April 2020 and did not return, compared with 0.33% in 2019.
Physicians aged 55 or older had higher rates of interruption both with and without return than younger doctors did. The change in interruption rates for older doctors was 7.2% vs. 3.9% for younger physicians. The change in older physicians’ interruption-without-return rate was 1.3% vs. 0.34% for younger colleagues.
“Female physicians, specialists, physicians in smaller practices, those not in a health professional shortage area, and those practicing in a metropolitan area experienced greater increases in practice interruption rates in April 2020 vs. April 2019,” the study states. “But those groups typically had higher rates of return, so the overall changes in practice interruptions without return were similar across characteristics other than age.”
Significance for retirement rate
Discussing these results, the authors stressed that practice interruptions without return can’t necessarily be attributed to retirement, and that practice interruptions with return don’t necessarily signify that doctors had been furloughed from their practices.
Also, they said, “this measure of practice interruption likely misses meaningful interruptions that lasted for less than a month or did not involve complete cessation in treating Medicare patients.”
Nevertheless, “the study does capture a signal of some doctors probably retiring,” Jonathan Weiner, DPH, professor of health policy and management at the Johns Hopkins Bloomberg School of Public Health, said in an interview.
But he added, “Some of those people who interrupted their practices and didn’t return may still come back. And there are probably a lot of other doctors who are leaving or changing practices that they didn’t capture.” For example, it’s possible that some doctors who went to work for other health care organizations stopped billing under their own names.
In Dr. Weiner’s view, the true percentage of physicians who have retired since the start of the pandemic is probably somewhere between the portion of doctors who interrupted their practice without return, according to the JAMA study, and the percentage of physicians who said they had closed their practices in the Physicians Foundation survey.
No mass exodus seen
Michael Belkin, JD, divisional vice president of recruiting for Merritt Hawkins, a physician search firm, said in an interview that the real number may be closer to the interruption-without-return figure in the JAMA study.
While many physician practices were disrupted in spring of 2020, he said, “it really didn’t result in a mass exodus [from health care]. We’re not talking to a lot of candidates who retired or walked away from their practices. We are talking to candidates who slowed down last year and then realized that they wanted to get back into medicine. And now they’re actively looking.”
One change in job candidates’ attitude, Mr. Belkin said, is that, because of COVID-19–related burnout, their quality of life is more important to them.
“They want to know, ‘What’s the culture of the employer like? What did they do last year during COVID? How did they handle it? Have they put together any protocols for the next pandemic?’”
Demand for doctors has returned
In the summer of 2020, there was a major drop in physician recruitment by hospitals and health systems, partly because of fewer patient visits and procedures. But demand for doctors has bounced back over the past year, Mr. Belkin noted. One reason is the pent-up need for care among patients who avoided health care providers in 2020.
Another reason is that some employed doctors – particularly older physicians – have slowed down. Many doctors prefer to work remotely 1 or 2 days a week, providing telehealth visits to patients. That has led to a loss of productivity in many health care organizations and, consequently, a need to hire additional physicians.
Nevertheless, not many doctors are heading for the exit earlier than physicians did before COVID-19.
“They may work reduced hours,” Mr. Belkin said. “But the sense from a physician’s perspective is that this is all they know. For them to walk away from their life in medicine, from who they are, is problematic. So they’re continuing to practice, but at a reduced capacity.”
A version of this article first appeared on Medscape.com.
MIND diet preserves cognition, new data show
Adherence to the MIND diet can improve memory and thinking skills of older adults, even in the presence of Alzheimer’s disease pathology, new data from the Rush Memory and Aging Project (MAP) show.
“The MIND diet was associated with better cognitive functions independently of brain pathologies related to Alzheimer’s disease, suggesting that diet may contribute to cognitive resilience, which ultimately indicates that it is never too late for dementia prevention,” lead author Klodian Dhana, MD, PhD, with the Rush Institute of Healthy Aging at Rush University, Chicago, said in an interview.
The study was published online Sept. 14, 2021, in the Journal of Alzheimer’s Disease.
Impact on brain pathology
“While previous investigations determined that the MIND diet is associated with a slower cognitive decline, the current study furthered the diet and brain health evidence by assessing the impact of brain pathology in the diet-cognition relationship,” Dr. Dhana said.
The MIND diet was pioneered by the late Martha Clare Morris, ScD, a Rush nutritional epidemiologist, who died in 2020 of cancer at age 64. A hybrid of the Mediterranean and DASH (Dietary Approaches to Stop Hypertension) diets, the MIND diet includes green leafy vegetables, fish, nuts, berries, beans, and whole grains and limits consumption of fried and fast foods, sweets, and pastries.
The current study focused on 569 older adults who died while participating in the MAP study, which began in 1997. Participants in the study were mostly White and were without known dementia. All of the participants agreed to undergo annual clinical evaluations. They also agreed to undergo brain autopsy after death.
Beginning in 2004, participants completed annual food frequency questionnaires, which were used to calculate a MIND diet score based on how often the participants ate specific foods.
The researchers used a series of regression analyses to examine associations of the MIND diet, dementia-related brain pathologies, and global cognition near the time of death. Analyses were adjusted for age, sex, education, apo E4, late-life cognitive activities, and total energy intake.
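As a rough illustration of what such an adjusted model looks like, here is a minimal sketch in Python using the statsmodels formula interface on synthetic data. The variable names (mind_score, apoe4, and so on) and the data-generating step are hypothetical stand-ins; this is not the authors’ code or their exact specification.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 569  # same order of magnitude as the autopsied sample
df = pd.DataFrame({
    "mind_score": rng.uniform(4, 13, n),             # MIND scores roughly span 0-15
    "age_at_death": rng.normal(90, 6, n),
    "male": rng.integers(0, 2, n),
    "education_yrs": rng.normal(15, 3, n),
    "apoe4": rng.integers(0, 2, n),
    "late_life_cog_activity": rng.normal(3.2, 0.7, n),
    "energy_kcal": rng.normal(1800, 400, n),
})
# Synthetic outcome with a small positive diet effect, for demonstration only.
df["global_cognition"] = (0.1 * df["mind_score"] - 0.03 * df["age_at_death"]
                          + rng.normal(0, 0.5, n))

model = smf.ols(
    "global_cognition ~ mind_score + age_at_death + male + education_yrs"
    " + apoe4 + late_life_cog_activity + energy_kcal",
    data=df,
).fit()
print(model.params["mind_score"], model.pvalues["mind_score"])  # beta and P value for the diet term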
Better adherence to the MIND diet was associated with better global cognitive functioning near the time of death (beta, 0.119; P = .003).
Notably, the researchers said, neither the strength nor the significance of association changed markedly when AD pathology and other brain pathologies were included in the model (beta, 0.111; P = .003).
The relationship between better adherence to the MIND diet and better cognition remained significant when the analysis was restricted to individuals without mild cognitive impairment at baseline (beta, 0.121; P = .005) as well as to persons in whom a postmortem diagnosis of AD was made on the basis of NIA-Reagan consensus recommendations (beta, 0.114; P = .023).
The limitations of the study include the reliance on self-reported diet information and a sample made up of mostly White volunteers who agreed to annual evaluations and postmortem organ donation, thus limiting generalizability.
Strengths of the study include the prospective design with annual assessment of cognitive function using standardized tests and collection of the dietary information using validated questionnaires. Also, the neuropathologic evaluations were performed by examiners blinded to clinical data.
“Diet changes can impact cognitive functioning and risk of dementia, for better or worse. There are fairly simple diet and lifestyle changes a person could make that may help to slow cognitive decline with aging and contribute to brain health,” Dr. Dhana said in a news release.
Builds resilience
Weighing in on the study, Heather Snyder, PhD, vice president of medical and scientific relations for the Alzheimer’s Association, said this “interesting study sheds light on the impact of nutrition on cognitive function.”
“The findings add to the growing literature that lifestyle factors – like access to a heart-healthy diet – may help the brain be more resilient to disease-specific changes,” Dr. Snyder said in an interview.
“The Alzheimer’s Association’s US POINTER study is investigating how lifestyle interventions, including nutrition guidance, like the MIND diet, may impact a person’s risk of cognitive decline. An ancillary study of the US POINTER will include brain imaging to investigate how these lifestyle interventions impact the biology of the brain,” Dr. Snyder noted.
The research was supported by the National Institute on Aging of the National Institutes of Health. Dr. Dhana and Dr. Snyder disclosed no relevant financial relationships.
A version of this article first appeared on Medscape.com.
Acceptance of biosimilars grows but greater use may hinge on switching, interchangeability studies
It took years for Elle Moxley to get a diagnosis that explained her crippling gastrointestinal pain, digestion problems, fatigue, and hot, red rashes. And after learning in 2016 that she had Crohn’s disease, a chronic inflammation of the digestive tract, she spent more than 4 years trying medications before getting her disease under control with a biologic drug called Remicade.
So Ms. Moxley, 33, was dismayed to receive a notice from her insurer in January that Remicade would no longer be covered as a preferred drug on her plan. Another drug, Inflectra, which the Food and Drug Administration says has no meaningful clinical differences from Remicade, is now preferred. It is a “biosimilar” drug.
“I felt very powerless,” said Ms. Moxley, who recently started a job as a public relations coordinator for Kansas City (Mo.) Public Schools. “I have this decision being made for me and my doctor that’s not in my best interest, and it might knock me out of remission.”
After Ms. Moxley’s first Inflectra infusion in July, she developed a painful rash. It went away after a few days, but she said she continues to feel extremely fatigued and experiences gastrointestinal pain, constipation, diarrhea and nausea.
Many medical professionals look to biosimilar drugs as a way to increase competition and give consumers cheaper options, much as generic drugs do, and they point to the more robust use of these products in Europe to cut costs.
Yet the United States has been slower to adopt biosimilar drugs since the first such medicine was approved in 2015. That’s partly because of concerns raised by patients like Ms. Moxley and their doctors, but also because brand-name biologics have kept biosimilars from entering the market. The companies behind the brand-name drugs have used legal actions to extend the life of their patents, as well as incentives that make offering the brand biologic more attractive than offering a biosimilar on a formulary, the list of drugs covered by an insurance plan.
“It distorts the market and makes it so that patients can’t get access,” said Jinoos Yazdany, MD, MPH, a professor of medicine and chief of the rheumatology division at Zuckerberg San Francisco General Hospital.
The FDA has approved 31 biosimilar medications since 2015, but only about 60% have made it to market, according to an analysis by NORC, a research organization at the University of Chicago.
Remicade’s manufacturer, Johnson & Johnson, and Pfizer, which makes the Remicade biosimilar Inflectra, have been embroiled in a long-running lawsuit over Pfizer’s claims that Johnson & Johnson tried to choke off competition through exclusionary contracts with insurers and other anticompetitive actions. In July, the companies settled the case on undisclosed terms.
In a statement, Pfizer said it would continue to sell Inflectra in the United States but noted ongoing challenges: “Pfizer has begun to see progress in the overall biosimilars marketplace in the U.S. However, changes in policy at a government level and acceptance of biosimilars among key stakeholders are critical to deliver more meaningful uptake so patients and the health care system at large can benefit from the cost savings these medicines may deliver.”
Johnson & Johnson said it is committed to making Remicade available to patients who choose it, which “compels us to compete responsibly on both price and value.”
Biologic medicines, which are generally grown from living organisms such as animal cells or bacteria, are more complex and expensive to manufacture than drugs made from chemicals. In recent years, biologic drugs have become a mainstay of treatment for autoimmune conditions like Crohn’s disease and rheumatoid arthritis, as well as certain cancers and diabetes, among other conditions.
Other drugmakers can’t exactly reproduce these biologic drugs by following chemical recipes as they do for generic versions of conventional drugs.
Instead, biosimilar versions of biologic drugs are generally made from the same types of materials as the original biologics and must be “highly similar” to them to be approved by the FDA. They must have no clinically meaningful differences from the biologic drug, and be just as safe, pure and potent. More than a decade after Congress created an approval pathway for biosimilars, they are widely accepted as safe and effective alternatives to brand biologics.
Medical experts hope that as biosimilars become more widely used they will increasingly provide a brake on drug spending.
From 2015 to 2019, drug spending overall grew 6.1%, while spending on biologics grew more than twice as much – 14.6% – according to a report by IQVIA, a health care analytics company. In 2019, biologics accounted for 43% of drug spending in the United States.
Biosimilars provide a roughly 30% discount over brand biologics in the United States but have the potential to reduce spending by more than $100 billion in the next 5 years, the IQVIA analysis found.
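To give a sense of how such savings scale, the back-of-the-envelope sketch below uses purely hypothetical figures for the per-patient cost of a brand biologic and for biosimilar uptake; it is not the IQVIA projection, only an illustration of the arithmetic behind a roughly 30% discount.

# All inputs are illustrative assumptions, not published figures.
brand_annual_cost = 50_000            # hypothetical per-patient cost of a brand biologic, USD
discount = 0.30                       # roughly 30% biosimilar discount cited above
patients_on_biosimilar = 200_000      # hypothetical number of patients using the biosimilar

per_patient_savings = brand_annual_cost * discount
total_annual_savings = per_patient_savings * patients_on_biosimilar
print(f"${per_patient_savings:,.0f} saved per patient, about ${total_annual_savings / 1e9:.1f} billion per year")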
In a survey of 602 physicians who prescribe biologic medications, more than three-quarters said they believed biosimilars are just as safe and effective as their biologic counterparts, according to NORC.
But they were less comfortable with switching patients from a brand biologic to a biosimilar. While about half said they were very likely to prescribe a biosimilar to a patient just starting biologic therapy, only 31% said they were very likely to prescribe a biosimilar to a patient already doing well on a brand biologic.
It can be challenging to find a treatment regimen that works for patients with complicated chronic conditions, and physicians and patients often don’t want to rock the boat once that is achieved.
In Ms. Moxley’s case, for example, before her condition stabilized on Remicade, she tried a conventional pill called Lialda, the biologic drug Humira and a lower dose of Remicade.
Some doctors and patients raise concerns that switching between these drugs might cause patients to develop antibodies that cause the drugs to lose effectiveness. They want to see more research about the effects of such switches.
“We haven’t seen enough studies about patients going from the biologic to the biosimilar and bouncing back and forth,” said Marcus Snow, MD, chair of the American College of Rheumatology’s Committee on Rheumatologic Care. “We don’t want our patients to be guinea pigs.”
Manufacturers of biologic and biosimilar drugs have participated in advertising, exhibit or sponsorship opportunities with the American College of Rheumatology, according to ACR spokesperson Jocelyn Givens.
But studies show a one-time switch from Remicade to a biosimilar like Inflectra does not cause side effects or the development of antibodies, said Ross Maltz, MD, a pediatric gastroenterologist at Nationwide Children’s Hospital in Columbus, Ohio, and former member of the Crohn’s & Colitis Foundation’s National Scientific Advisory Committee. Such studies, however, may be conducted by researchers with extensive ties to the industry and funded by drugmakers.
Situations like Ms. Moxley’s are unusual, said Kristine Grow, senior vice president of communications at AHIP, an insurer trade group.
“For patients who have been taking a brand-name biologic for some time, health insurance providers do not typically encourage them to switch to a biosimilar because of a formulary change, and most plans exclude these patients from any changes in cost sharing due to formulary changes,” she said.
Drugmakers can seek FDA approval of a biosimilar as interchangeable with a biologic drug, which allows pharmacists, subject to state law, to substitute the biosimilar for the brand-name drug on a physician’s prescription, as they often do with generic drugs.
However, the FDA has approved only one biosimilar (Semglee, a form of insulin) as interchangeable with a biologic (Lantus).
Like Ms. Moxley, many other patients using biologics get copay assistance from drug companies, but the money often isn’t enough to cover the full cost. In her old job as a radio reporter, Ms. Moxley said, she hit the $7,000 maximum annual out-of-pocket spending limit for her plan by May.
In her new job, Ms. Moxley has an individual plan with a $4,000 maximum out-of-pocket limit, which she expects to blow past once again within months.
But she received good news recently: Her new plan will cover Remicade.
“I’m still concerned that I will have developed antibodies since my last dose,” she said. “But it feels like a step in the direction of good health again.”
KHN (Kaiser Health News) is a national newsroom that produces in-depth journalism about health issues. Together with Policy Analysis and Polling, KHN is one of the three major operating programs at KFF (Kaiser Family Foundation). KFF is an endowed nonprofit organization providing information on health issues to the nation.
It took years for Elle Moxley to get a diagnosis that explained her crippling gastrointestinal pain, digestion problems, fatigue, and hot, red rashes. And after learning in 2016 that she had Crohn’s disease, a chronic inflammation of the digestive tract, she spent more than 4 years trying medications before getting her disease under control with a biologic drug called Remicade.
So Ms. Moxley, 33, was dismayed to receive a notice from her insurer in January that Remicade would no longer be covered as a preferred drug on her plan. Another drug, Inflectra, which the Food and Drug Administration says has no meaningful clinical differences from Remicade, is now preferred. It is a “biosimilar” drug.
“I felt very powerless,” said Ms. Moxley, who recently started a job as a public relations coordinator for Kansas City (Mo.) Public Schools. “I have this decision being made for me and my doctor that’s not in my best interest, and it might knock me out of remission.”
After Ms. Moxley’s first Inflectra infusion in July, she developed a painful rash. It went away after a few days, but she said she continues to feel extremely fatigued and experiences gastrointestinal pain, constipation, diarrhea and nausea.
Many medical professionals look to biosimilar drugs as a way to increase competition and give consumers cheaper options, much as generic drugs do, and they point to the more robust use of these products in Europe to cut costs.
Yet the United States has been slower to adopt biosimilar drugs since the first such medicine was approved in 2015. That’s partly because of concerns raised by patients like Moxley and their doctors, but also because brand-name biologics have kept biosimilars from entering the market. The companies behind the brand-name drugs have used legal actions to extend the life of their patents and incentives that make offering the brand biologic more attractive than offering a biosimilar on a formulary, listing which drugs are covered on an insurance plan.
“It distorts the market and makes it so that patients can’t get access,” said Jinoos Yazdany, MD, MPH, a professor of medicine and chief of the rheumatology division at Zuckerberg San Francisco General Hospital.
The FDA has approved 31 biosimilar medications since 2015, but only about 60% have made it to market, according to an analysis by NORC, a research organization at the University of Chicago.
Remicade’s manufacturer, Johnson & Johnson, and Pfizer, which makes the Remicade biosimilar Inflectra, have been embroiled in a long-running lawsuit over Pfizer’s claims that Johnson & Johnson tried to choke off competition through exclusionary contracts with insurers and other anticompetitive actions. In July, the companies settled the case on undisclosed terms.
In a statement, Pfizer said it would continue to sell Inflectra in the United States but noted ongoing challenges: “Pfizer has begun to see progress in the overall biosimilars marketplace in the U.S. However, changes in policy at a government level and acceptance of biosimilars among key stakeholders are critical to deliver more meaningful uptake so patients and the health care system at large can benefit from the cost savings these medicines may deliver.”
Johnson & Johnson said it is committed to making Remicade available to patients who choose it, which “compels us to compete responsibly on both price and value.”
Biologic medicines, which are generally grown from living organisms such as animal cells or bacteria, are more complex and expensive to manufacture than drugs made from chemicals. In recent years, biologic drugs have become a mainstay of treatment for autoimmune conditions like Crohn’s disease and rheumatoid arthritis, as well as certain cancers and diabetes, among other conditions.
Other drugmakers can’t exactly reproduce these biologic drugs by following chemical recipes as they do for generic versions of conventional drugs.
Instead, biosimilar versions of biologic drugs are generally made from the same types of materials as the original biologics and must be “highly similar” to them to be approved by the FDA. They must have no clinically meaningful differences from the biologic drug, and be just as safe, pure and potent. More than a decade after Congress created an approval pathway for biosimilars, they are widely accepted as safe and effective alternatives to brand biologics.
Medical experts hope that as biosimilars become more widely used they will increasingly provide a brake on drug spending.
From 2015 to 2019, drug spending overall grew 6.1%, while spending on biologics grew more than twice as much – 14.6% – according to a report by IQVIA, a health care analytics company. In 2019, biologics accounted for 43% of drug spending in the United States
Biosimilars provide a roughly 30% discount over brand biologics in the United States but have the potential to reduce spending by more than $100 billion in the next 5 years, the IQVIA analysis found.
In a survey of 602 physicians who prescribe biologic medications, more than three-quarters said they believed biosimilars are just as safe and effective as their biologic counterparts, according to NORC.
But they were less comfortable with switching patients from a brand biologic to a biosimilar. While about half said they were very likely to prescribe a biosimilar to a patient just starting biologic therapy, only 31% said they were very likely to prescribe a biosimilar to a patient already doing well on a brand biologic.
It can be challenging to find a treatment regimen that works for patients with complicated chronic conditions, and physicians and patients often don’t want to rock the boat once that is achieved.
In Ms. Moxley’s case, for example, before her condition stabilized on Remicade, she tried a conventional pill called Lialda, the biologic drug Humira and a lower dose of Remicade.
Some doctors and patients raise concerns that switching between these drugs might cause patients to develop antibodies that cause the drugs to lose effectiveness. They want to see more research about the effects of such switches.
“We haven’t seen enough studies about patients going from the biologic to the biosimilar and bouncing back and forth,” said Marcus Snow, MD, chair of the American College of Rheumatology’s Committee on Rheumatologic Care. “We don’t want our patients to be guinea pigs.”
Manufacturers of biologic and biosimilar drugs have participated in advertising, exhibit or sponsorship opportunities with the American College of Rheumatology, according to ACR spokesperson Jocelyn Givens.
But studies show a one-time switch from Remicade to a biosimilar like Inflectra does not cause side effects or the development of antibodies, said Ross Maltz, MD, a pediatric gastroenterologist at Nationwide Children’s Hospital in Columbus, Ohio, and former member of the Crohn’s & Colitis Foundation’s National Scientific Advisory Committee. Studies may be conducted by researchers with extensive ties to the industry and funded by drugmakers.
Situations like Ms. Moxley’s are unusual, said Kristine Grow, senior vice president of communications at AHIP, an insurer trade group.
“For patients who have been taking a brand-name biologic for some time, health insurance providers do not typically encourage them to switch to a biosimilar because of a formulary change, and most plans exclude these patients from any changes in cost sharing due to formulary changes,” she said.
Drugmakers can seek approval from the FDA of their biosimilar as interchangeable with a biologic drug, allowing pharmacists, subject to state law, to switch a physician’s prescription from the brand drug, as they often do with generic drugs.
However, the FDA has approved only one biosimilar (Semglee, a form of insulin) as interchangeable with a biologic (Lantus).
Like Ms. Moxley, many other patients using biologics get copay assistance from drug companies, but the money often isn’t enough to cover the full cost. In her old job as a radio reporter, Ms. Moxley said, she hit the $7,000 maximum annual out-of-pocket spending limit for her plan by May.
In her new job, Ms. Moxley has an individual plan with a $4,000 maximum out-of-pocket limit, which she expects to blow past once again within months.
But she received good news recently: Her new plan will cover Remicade.
“I’m still concerned that I will have developed antibodies since my last dose,” she said. “But it feels like a step in the direction of good health again.”
KHN (Kaiser Health News) is a national newsroom that produces in-depth journalism about health issues. Together with Policy Analysis and Polling, KHN is one of the three major operating programs at KFF (Kaiser Family Foundation). KFF is an endowed nonprofit organization providing information on health issues to the nation.
It took years for Elle Moxley to get a diagnosis that explained her crippling gastrointestinal pain, digestion problems, fatigue, and hot, red rashes. And after learning in 2016 that she had Crohn’s disease, a chronic inflammation of the digestive tract, she spent more than 4 years trying medications before getting her disease under control with a biologic drug called Remicade.
So Ms. Moxley, 33, was dismayed to receive a notice from her insurer in January that Remicade would no longer be covered as a preferred drug on her plan. Another drug, Inflectra, which the Food and Drug Administration says has no meaningful clinical differences from Remicade, is now preferred. It is a “biosimilar” drug.
“I felt very powerless,” said Ms. Moxley, who recently started a job as a public relations coordinator for Kansas City (Mo.) Public Schools. “I have this decision being made for me and my doctor that’s not in my best interest, and it might knock me out of remission.”
After Ms. Moxley’s first Inflectra infusion in July, she developed a painful rash. It went away after a few days, but she said she continues to feel extremely fatigued and experiences gastrointestinal pain, constipation, diarrhea and nausea.
Many medical professionals look to biosimilar drugs as a way to increase competition and give consumers cheaper options, much as generic drugs do, and they point to the more robust use of these products in Europe to cut costs.
Yet the United States has been slower to adopt biosimilar drugs since the first such medicine was approved in 2015. That’s partly because of concerns raised by patients like Moxley and their doctors, but also because brand-name biologics have kept biosimilars from entering the market. The companies behind the brand-name drugs have used legal actions to extend the life of their patents and incentives that make offering the brand biologic more attractive than offering a biosimilar on a formulary, listing which drugs are covered on an insurance plan.
“It distorts the market and makes it so that patients can’t get access,” said Jinoos Yazdany, MD, MPH, a professor of medicine and chief of the rheumatology division at Zuckerberg San Francisco General Hospital.
The FDA has approved 31 biosimilar medications since 2015, but only about 60% have made it to market, according to an analysis by NORC, a research organization at the University of Chicago.
Remicade’s manufacturer, Johnson & Johnson, and Pfizer, which makes the Remicade biosimilar Inflectra, have been embroiled in a long-running lawsuit over Pfizer’s claims that Johnson & Johnson tried to choke off competition through exclusionary contracts with insurers and other anticompetitive actions. In July, the companies settled the case on undisclosed terms.
In a statement, Pfizer said it would continue to sell Inflectra in the United States but noted ongoing challenges: “Pfizer has begun to see progress in the overall biosimilars marketplace in the U.S. However, changes in policy at a government level and acceptance of biosimilars among key stakeholders are critical to deliver more meaningful uptake so patients and the health care system at large can benefit from the cost savings these medicines may deliver.”
Johnson & Johnson said it is committed to making Remicade available to patients who choose it, which “compels us to compete responsibly on both price and value.”
Biologic medicines, which are generally grown from living organisms such as animal cells or bacteria, are more complex and expensive to manufacture than drugs made from chemicals. In recent years, biologic drugs have become a mainstay of treatment for autoimmune conditions like Crohn’s disease and rheumatoid arthritis, as well as certain cancers and diabetes, among other conditions.
Other drugmakers can’t exactly reproduce these biologic drugs by following chemical recipes as they do for generic versions of conventional drugs.
Instead, biosimilar versions of biologic drugs are generally made from the same types of materials as the original biologics and must be “highly similar” to them to be approved by the FDA. They must have no clinically meaningful differences from the biologic drug, and be just as safe, pure and potent. More than a decade after Congress created an approval pathway for biosimilars, they are widely accepted as safe and effective alternatives to brand biologics.
Medical experts hope that as biosimilars become more widely used they will increasingly provide a brake on drug spending.
From 2015 to 2019, drug spending overall grew 6.1%, while spending on biologics grew more than twice as fast – 14.6% – according to a report by IQVIA, a health care analytics company. In 2019, biologics accounted for 43% of drug spending in the United States.
Biosimilars currently provide a roughly 30% discount over brand biologics in the United States and have the potential to reduce spending by more than $100 billion over the next 5 years, the IQVIA analysis found.
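To put those percentages in perspective, here is a minimal back-of-the-envelope sketch, in Python, of how a roughly 30% biosimilar discount could compound into multi-year savings. The annual biologic spend, biosimilar uptake share, and growth rate below are placeholder assumptions for illustration only; they are not figures from the IQVIA report.

```python
# Illustrative arithmetic only: all inputs are placeholder assumptions,
# not IQVIA's model or data.
annual_biologic_spend = 250e9   # assumed U.S. biologic spending per year, in dollars
biosimilar_uptake = 0.25        # assumed share of that spending shifted to biosimilars
discount = 0.30                 # ~30% biosimilar discount cited above
growth_rate = 0.10              # assumed annual growth in biologic spending

total_savings = 0.0
spend = annual_biologic_spend
for year in range(5):           # 5-year horizon
    total_savings += spend * biosimilar_uptake * discount
    spend *= 1 + growth_rate    # spending base grows each year

print(f"Illustrative 5-year savings: ${total_savings / 1e9:.0f} billion")
```

Under these assumptions the toy model lands in the low hundreds of billions of dollars, the order of magnitude the IQVIA projection describes; the real estimate depends on actual uptake, pricing, and launch timing.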
In a survey of 602 physicians who prescribe biologic medications, more than three-quarters said they believed biosimilars are just as safe and effective as their biologic counterparts, according to NORC.
But they were less comfortable with switching patients from a brand biologic to a biosimilar. While about half said they were very likely to prescribe a biosimilar to a patient just starting biologic therapy, only 31% said they were very likely to prescribe a biosimilar to a patient already doing well on a brand biologic.
It can be challenging to find a treatment regimen that works for patients with complicated chronic conditions, and physicians and patients often don’t want to rock the boat once that is achieved.
In Ms. Moxley’s case, for example, before her condition stabilized on Remicade, she tried a conventional pill called Lialda, the biologic drug Humira and a lower dose of Remicade.
Some doctors and patients raise concerns that switching between these drugs might cause patients to develop antibodies that cause the drugs to lose effectiveness. They want to see more research about the effects of such switches.
“We haven’t seen enough studies about patients going from the biologic to the biosimilar and bouncing back and forth,” said Marcus Snow, MD, chair of the American College of Rheumatology’s Committee on Rheumatologic Care. “We don’t want our patients to be guinea pigs.”
Manufacturers of biologic and biosimilar drugs have participated in advertising, exhibit or sponsorship opportunities with the American College of Rheumatology, according to ACR spokesperson Jocelyn Givens.
But studies show a one-time switch from Remicade to a biosimilar like Inflectra does not cause side effects or the development of antibodies, said Ross Maltz, MD, a pediatric gastroenterologist at Nationwide Children’s Hospital in Columbus, Ohio, and former member of the Crohn’s & Colitis Foundation’s National Scientific Advisory Committee. However, such studies may be conducted by researchers with extensive ties to the industry and funded by drugmakers.
Situations like Ms. Moxley’s are unusual, said Kristine Grow, senior vice president of communications at AHIP, an insurer trade group.
“For patients who have been taking a brand-name biologic for some time, health insurance providers do not typically encourage them to switch to a biosimilar because of a formulary change, and most plans exclude these patients from any changes in cost sharing due to formulary changes,” she said.
Drugmakers can seek approval from the FDA of their biosimilar as interchangeable with a biologic drug, allowing pharmacists, subject to state law, to switch a physician’s prescription from the brand drug, as they often do with generic drugs.
However, the FDA has approved only one biosimilar (Semglee, a form of insulin) as interchangeable with a biologic (Lantus).
Like Ms. Moxley, many other patients using biologics get copay assistance from drug companies, but the money often isn’t enough to cover the full cost. In her old job as a radio reporter, Ms. Moxley said, she hit the $7,000 maximum annual out-of-pocket spending limit for her plan by May.
In her new job, Ms. Moxley has an individual plan with a $4,000 maximum out-of-pocket limit, which she expects to blow past once again within months.
But she received good news recently: Her new plan will cover Remicade.
“I’m still concerned that I will have developed antibodies since my last dose,” she said. “But it feels like a step in the direction of good health again.”
KHN (Kaiser Health News) is a national newsroom that produces in-depth journalism about health issues. Together with Policy Analysis and Polling, KHN is one of the three major operating programs at KFF (Kaiser Family Foundation). KFF is an endowed nonprofit organization providing information on health issues to the nation.
Study finds paying people to participate in clinical trials is not unethical
Paying people to participate in clinical trials remains controversial. But to date, most reservations are based on hypothetical scenarios or expert opinion with few real-world data to support them.
Research released this week could change that.
Investigators offered nearly 1,300 participants in two clinical trials either no payment or incentives up to $500 to partake in a smoking cessation study or an analysis of a behavioral intervention to increase ambulation in hospitalized patients.
More cash was associated with greater agreement to participate in the smoking cessation study but not the ambulation trial.
But the bigger news may be that offering payment did not appear to get people to accept more risks or skew participation to lower-income individuals, as some ethicists have warned.
“With the publication of our study, investigators finally have data that they can cite to put to rest any lingering concerns about offering moderate incentives in low-risk trials,” lead author Scott D. Halpern, MD, PhD, the John M. Eisenberg Professor of Medicine, Epidemiology, and Medical Ethics & Health Policy at the University of Pennsylvania, Philadelphia, told this news organization.
These initial real-world data center on low-risk interventions, and more research is needed to assess the ethics and effectiveness of paying people to join clinical trials that carry greater inherent risk, the researchers note.
The study was published online Sept. 20 in JAMA Internal Medicine.
A good first step?
“Payments to research participants are notoriously controversial. Many people oppose payments altogether or insist on minimal payments out of concern that people might be unduly influenced to participate,” Ana S. Iltis, PhD, told this news organization when asked for comment. “Others worry that incentives will disproportionately motivate the less well-off to participate.”
“This is an important study that begins to assess whether these concerns are justified in a real-world context,” added Dr. Iltis, director of the Center for Bioethics, Health and Society and professor of philosophy at Wake Forest University in Winston-Salem, N.C.
In an accompanying invited commentary, Sang Ngo, Anthony S. Kim, MD, and Winston Chiong, MD, PhD, write: “This work is welcome, as it presents experimental data to a bioethical debate that so far has been largely driven by conjecture and competing suppositions.”
The commentary authors, however, question the conclusiveness of the findings. “Interpreting the authors’ findings is complex and illustrates some of the challenges inherent to applying empirical data to ethical problems,” they write.
Recruitment realities
When asked his advice for researchers considering financial incentives, Dr. Halpern said: “All researchers would happily include incentives in their trial budgets if not for concerns that the sponsor or institutional review board might not approve of them.”
“By far the biggest threat to a trial’s success is the inability to enroll enough participants,” he added.
Dr. Iltis agreed, framing the need to boost enrollment in ethical terms. “There is another important ethical issue that often gets ignored, and that is the issue of studies that fail to enroll enough participants and are never completed or are underpowered,” she said.
“These studies end up exposing people to research risks and burdens without a compensating social benefit.”
“If incentives help to increase enrollment and do not necessarily result in undue influence or unfair participant selection, then there might be ethical reasons to offer incentives,” Dr. Iltis added.
Building on previous work assessing financial incentives in hypothetical clinical trials, Dr. Halpern and colleagues studied 654 participants with major depressive disorder in a smoking cessation trial. They also studied another 642 participants in a study that compared a gamification strategy to usual care for encouraging hospitalized patients to get out of bed and walk.
Dr. Halpern and colleagues randomly assigned people in the smoking cessation study to receive no financial compensation, $200, or $500. In the ambulation trial, participants were randomly allocated to receive no compensation, $100, or $300.
Key findings
A total of 22% of those offered no incentive enrolled in the smoking cessation study. In contrast, 36% offered $200 agreed, as did 47% of those offered $500, which the investigators say supports offering cash incentives to boost enrollment. The differences were significant (P < .001).
In contrast, the amount offered did not significantly incentivize more people to participate in the ambulation trial (P = .62). Rates were 45% with no compensation, 48% with $100 payment, and 43% with $300 payment.
In an analysis that adjusted for demographic differences, financial well-being, and Research Attitudes Questionnaire (RAQ-7) scores, each increase in cash incentive increased the odds of enrollment in the smoking cessation trial by 70% (adjusted odds ratio, 1.70; 95% confidence interval, 1.34-2.17).
The same effect was not seen in the ambulation trial, where each higher cash incentive did not make a significant difference (aOR, 0.88; 95% CI, 0.64-1.22).
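To see how enrollment percentages of this kind translate into odds ratios, the short Python sketch below converts the reported unadjusted enrollment rates in the smoking cessation trial into odds and pairwise odds ratios. This is illustrative arithmetic on the published percentages only; the 1.70 estimate above comes from a logistic model adjusted for demographics, financial well-being, and RAQ-7 scores, so it will not match these crude values exactly.

```python
# Unadjusted enrollment rates reported for the smoking cessation trial
rates = {"$0": 0.22, "$200": 0.36, "$500": 0.47}

def odds(p: float) -> float:
    """Convert a probability to odds."""
    return p / (1 - p)

for incentive, p in rates.items():
    print(f"{incentive}: enrollment {p:.0%}, odds {odds(p):.2f}")

# Crude odds ratios between adjacent incentive tiers
print(f"OR, $200 vs $0:   {odds(0.36) / odds(0.22):.2f}")
print(f"OR, $500 vs $200: {odds(0.47) / odds(0.36):.2f}")
```

Run as written, the crude per-tier odds ratios come out near 2.0 and 1.6, bracketing the adjusted estimate of 1.70 per incentive increase.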
“The ambulation trial was a lower-risk trial in which patients’ willingness to participate was higher in general. So there were likely fewer people whose participation decisions could be influenced by offers of money,” Dr. Halpern said.
Inducement vs. coercion
The incentives in the study “did not function as unjust inducements, as they were not preferentially motivating across groups with different income levels or financial well-being in either trial,” the researchers note.
Dr. Halpern and colleagues also checked for any perceptions of coercion. More than 70% of participants in each smoking cessation trial group perceived no coercion, as did more than 93% of participants in each ambulation trial group, according to scores on a modified Perceived Coercion Scale of the MacArthur Admission Experience Survey.
Furthermore, perception of risks did not significantly alter the association between cash incentives and enrollment in either trial.
After collecting the findings, Dr. Halpern and colleagues informed participants about their inclusion in RETAIN, the embedded study of incentives, and explained the rationale for using different cash incentives. They also let all participants know they would ultimately receive the maximum incentive – either $500 or $300, depending on the trial.
Research implications
One study limitation was the reliance on participants’ perceptions of risk; another was the inability to measure perceived coercion among people who chose not to participate in the trials. Another potential limitation is that “neither of these parent trials posed particularly high risks. Future tests of incentives of different sizes, and in the context of higher-risk parent trials, including trials that test treatments of serious illnesses, are warranted,” the researchers note.
“While there are many more questions to ask and contexts in which to study the effects of incentives, this study calls on opponents of incentivizing research participants with money to be more humble,” Dr. Iltis said. “Incentives might not have the effects they assume they have and which they have long held make such incentives unethical.”
“I encourage researchers who are offering incentives to consider working with people doing ethics research to assess the effects of incentives in their studies,” Dr. Halpern said. “Real-world, as opposed to hypothetical studies that can improve our understanding of the impact of incentives can improve the ethical conduct of research over time.”
Responding to criticism
The authors of the invited commentary questioned the definitions Dr. Halpern and colleagues used for undue or unjust inducement. “Among bioethicists, there is no consensus about what counts as undue inducement or an unjust distribution of research burdens. In this article, the authors have operationalized these constructs based on their own interpretations of undue and unjust inducement, which may not capture all the concerns that scholars have raised about inducement.”
Asked to respond to this and other criticisms raised in the commentary, Dr. Halpern said: “Did our study answer all possible questions about incentives? Absolutely not. But when it comes to incentives for research participation, an ounce of data is worth a pound of conjecture.”
There was agreement, however, that the findings could now put the onus on opponents of financial incentives for trial participants.
“I agree with the commentary’s authors that our study essentially shifts the burden of proof, such that, as they say, ‘those who would limit [incentives’] application may owe us an applicable criterion,’ ” Dr. Halpern said.
The authors of the invited commentary also criticized use of the study’s noninferiority design to rule out undue or unjust inducement. They note this design “may be unfamiliar to many bioethicists and can place substantial evaluative demands on readers.”
“As for the authors’ claim that noninferiority designs are difficult to interpret and unfamiliar to most clinicians and ethicists, I certainly agree,” Dr. Halpern said. “But that is hardly a reason to not employ the most rigorous methods possible to answer important questions.”
The study was supported by funding from the National Cancer Institute.
A version of this article first appeared on Medscape.com.
Could the osteoporosis drug alendronate ward off diabetes?
A nationwide, retrospective, case-control study of older adults in Denmark suggests that alendronate, a bisphosphonate widely used to treat osteoporosis, may protect against new-onset type 2 diabetes. But these preliminary findings need to be confirmed in a randomized controlled trial, experts said.
The registry study showed that from 2008 to 2018, among individuals in Denmark age 50 and older (with a mean age of 67), those who were taking alendronate were 36% less likely to have new-onset type 2 diabetes than age- and sex-matched individuals who were not taking the drug, after controlling for multiple risk factors.
The results also suggest that longer alendronate use and higher compliance might be more protective.
Rikke Viggers, MD, a PhD student in the department of clinical medicine, Aalborg (Denmark) University, presented the findings during an oral session at the annual meeting of the European Association for the Study of Diabetes.
“Excitingly, our research suggests that alendronate, an inexpensive medicine widely used to treat osteoporosis, may also protect against type 2 diabetes,” Dr. Viggers summarized in a press release issued by the EASD.
“Type 2 diabetes is a serious lifelong condition that can lead to other serious health issues such as stroke, heart disease, blindness, and limb amputation,” she noted, “and anything that prevents or even delays it will also reduce a person’s risk of all these other conditions.”
“We believe that doctors should consider this when prescribing osteoporosis drugs to those with prediabetes or at high risk of type 2 diabetes,” she added.
Preliminary results, need for RCT
However, these are preliminary results, Dr. Viggers cautioned during the oral presentation and in an email. “This is a registry-based study,” she stressed, “and we cannot conclude causality.”
“We do not know if this effect [of decreased risk of developing diabetes among people taking alendronate] is ‘real’ and what the mechanisms are.”
“It could be a direct effect on peripheral tissues, for example, muscle and adipose tissue,” Dr. Viggers speculated, “or an indirect effect through bone metabolites that may impact glucose metabolism.”
The group is now conducting a randomized controlled trial in patients with diabetes and osteopenia or osteoporosis to examine the relationship between alendronate and insulin sensitivity, bone indices, and glycemic control.
They also aim to investigate whether alendronate is the optimal antiosteoporotic therapy for patients with type 2 diabetes. Preliminary results suggest that other bisphosphonates have similar effects.
“Alendronate decreases bone turnover and may not be beneficial in healthy bones,” Dr. Viggers noted. “However, as far as I know, potential other side effects have not been tested in healthy bones,” so further research is needed.
Invited to comment, Charles P. Vega, MD, who presented a case and a crowd-sourced opinion about deprescribing bisphosphonates, noted that type 2 diabetes is most often diagnosed between ages 40 and 60, although some cases are diagnosed after age 65. The study by Dr. Viggers and colleagues suggests that alendronate might help lower the risk of diabetes onset in these older adults, he said.
“This is an interesting retrospective analysis,” said Dr. Vega, health sciences clinical professor, family medicine, University of California, Irvine, but like the study authors, he cautioned that “it should be verified with other data.”
“A meta-analysis from clinical trials of bisphosphonates which followed blood glucose levels would be helpful,” he said.
Current registry study findings
Glucose homeostasis has been linked to bone metabolism, Dr. Viggers said, and bisphosphonate use was associated with increased insulin sensitivity and decreased diabetes risk in two registry studies from Denmark and Taiwan.
The researchers aimed to investigate if the risk of developing type 2 diabetes was altered by previous use of alendronate.
Using data from the national Danish Patient Registry, they identified 163,588 individuals age 50 and older newly diagnosed with type 2 diabetes in 2008-2018.
They matched each patient with three individuals of the same gender and age range who did not have diabetes, for a total of 490,764 controls.
Roughly two-thirds of participants were in their 50s or 60s, a quarter were in their 70s, and 10% were 80 or older. About half of participants were women (45%).
Compared to the patients with new-onset type 2 diabetes, the control participants were healthier: they were less likely to have obesity (6% vs. 17%) and had a lower mean Charlson Comorbidity Index (0.38 vs. 0.88).
Using data from the national Danish Health Service Prescription Registry, the researchers identified individuals who filled prescriptions for alendronate in 2008-2018.
After controlling for heavy smoking, alcohol abuse, obesity, pancreatitis, hyperthyroidism, hypothyroidism, glucocorticoid use, marital status, household income, and Charlson Comorbidity Index, people taking alendronate were less likely to have new-onset diabetes than those not taking this drug (odds ratio, 0.64; 95% confidence interval, 0.62-0.66).
The odds of developing type 2 diabetes were even lower among those who took alendronate for 8 years or more versus never-users (OR, 0.47; 95% CI, 0.40-0.56), after controlling for the same variables.
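For readers less familiar with case-control statistics, the Python sketch below shows how an odds ratio and its 95% confidence interval are computed from a simple 2x2 exposure table. The counts are hypothetical placeholders chosen only so the result lands near the reported 0.64; they are not the Danish registry data, and the study’s published estimates come from models adjusted for the covariates listed above.

```python
import math

# Hypothetical counts (NOT the study data); exposure = alendronate use
exposed_cases = 3_200         # new type 2 diabetes, took alendronate
unexposed_cases = 160_000     # new type 2 diabetes, no alendronate
exposed_controls = 14_800     # no diabetes, took alendronate
unexposed_controls = 476_000  # no diabetes, no alendronate

# Crude odds ratio = (a * d) / (b * c)
odds_ratio = (exposed_cases * unexposed_controls) / (unexposed_cases * exposed_controls)

# Approximate 95% CI on the log scale (Woolf method)
se_log_or = math.sqrt(1 / exposed_cases + 1 / unexposed_cases
                      + 1 / exposed_controls + 1 / unexposed_controls)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(f"Crude OR = {odds_ratio:.2f} (95% CI, {ci_low:.2f}-{ci_high:.2f})")
```

With these placeholder counts the crude odds ratio works out to about 0.64, illustrating the mechanics behind figures like those reported; the study itself additionally adjusted for smoking, alcohol abuse, obesity, and the other covariates noted above.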
Session Chair Zhila Semnani-Azad, a PhD student in nutritional science, University of Toronto, wanted to know if the researchers accounted for physical activity and vitamin D use. Dr. Viggers replied that the registries did not have this information.
The study was funded by a Steno Collaborative Project grant from the Novo Nordisk Foundation, Denmark. Dr. Viggers has disclosed receiving a grant from the foundation. Dr. Vega has disclosed serving as a consultant for Johnson & Johnson.
A version of this article first appeared on Medscape.com.
FROM EASD 2021