Diabetes patients pushed into high-deductible plans
ORLANDO – The proportion of diabetes patients enrolled in high-deductible health plans jumped from 10% in 2005 to about 50% in 2014, according to a review of insurance data for 63 million Americans under age 65 years.
Diabetes patients often don’t have a choice. To cut costs, high-deductible plans are increasingly the only ones employers offer.
While that may be adequate for healthy people, it’s quite another issue for people with chronic conditions, especially ones with low income. Out-of-pocket expenses can be thousands of dollars more than with traditional health plans, and the extra costs aren’t always offset by lower premiums.
The trend is concerning, said senior investigator J. Frank Wharam, MB, MPH, an associate professor of population medicine at Harvard Medical School, Boston. He explained the problem, and what’s being done about it, in an interview at the annual scientific sessions of the American Diabetes Association.
SOURCE: Garabedian LF et al. ADA 2018. Abstract 175-OR.
REPORTING FROM ADA 2018
Eversense CGM shown safe, accurate for 180 days in adolescents
ORLANDO – The Eversense continuous glucose monitoring (CGM) system, recently approved for use in adults with diabetes, also provides safe, durable, and accurate monitoring in the pediatric population, according to findings from a prospective single-arm study of 30 children and 6 adults.
Study subjects, all older than 11 years (mean age, 14 years), had the fully implantable sensor inserted at day 0 and removed at day 180. The mean absolute relative difference (MARD) between sensor and laboratory glucose values showed high device accuracy, Ronnie Aronson, MD, reported at the annual scientific sessions of the American Diabetes Association.
“Anything under 10% is considered good, and ours was 9.4% – and it didn’t deteriorate throughout the duration, so at 180 days it was still at 9.4%; every accuracy measure we looked at showed similar high levels of accuracy,” Dr. Aronson, founder and chief medical officer of LMC Diabetes & Endocrinology in Ontario, Canada, said in a video interview.
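For readers unfamiliar with the accuracy metric, MARD is simply the average of the absolute differences between paired sensor and reference readings, each expressed relative to the reference value. A minimal sketch follows; the glucose values are illustrative, not study data.

```python
# Minimal sketch of a mean absolute relative difference (MARD) calculation.
# Sample readings are made up for illustration; real MARD studies use many
# paired sensor/laboratory measurements per subject.

def mard(sensor_readings, reference_readings):
    """Mean absolute relative difference, as a percentage."""
    pairs = list(zip(sensor_readings, reference_readings))
    return 100 * sum(abs(s - r) / r for s, r in pairs) / len(pairs)

sensor = [102, 148, 95, 210]      # mg/dL, from the CGM
reference = [110, 140, 100, 200]  # mg/dL, paired laboratory values

print(round(mard(sensor, reference), 1))  # → 5.7
```

A lower MARD means sensor readings track the reference more closely; the 9.4% reported here falls under the commonly cited 10% threshold.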
The sensor, which is roughly 1.5 cm long, is coated with a material that fluoresces when exposed to glucose; the sensor uses the amount of light emitted to calculate blood glucose levels. Patients use an adhesive patch, changed daily, to attach a “smart” transmitter that overlies the area where the sensor is implanted. This rechargeable transmitter sends blood glucose levels to the mobile app every 5 minutes, and also powers the sensor. The Food and Drug Administration approved it for use in adults on June 21.
The system was highly rated by study participants, he said. “What makes it stand out is that it’s implanted, it’s there for at least 180 days, it’s accurate for 180 days,” the transmitter can be taken on and off, and the results can be seen very easily on a smartphone or Apple Watch.
Dr. Aronson said he also hopes to study the device in younger patients and for longer durations.
Dr. Aronson is an advisor for Novo Nordisk and Sanofi. He also receives research support from AstraZeneca, Eli Lilly, Valeant, Janssen, and Senseonics.
SOURCE: Aronson R et al. ADA 2018 Abstract 13-OR.
REPORTING FROM ADA 2018
Key clinical point: The Eversense fully implantable continuous glucose monitoring device is safe and accurate in adolescents.
Major finding: The MARD between sensor and true laboratory glucose values showed high device accuracy, at 9.4% over 180 days.
Study details: A prospective single-arm study of 30 children and 6 adults.
Disclosures: Dr. Aronson is an advisor for Novo Nordisk and Sanofi. He also receives research support from AstraZeneca, Eli Lilly, Valeant, Janssen, and Senseonics.
Source: Aronson R et al. ADA 2018, Abstract 13-OR.
Switch back to human insulin a viable money saver
ORLANDO – It’s safe to switch many Medicare beneficiaries with type 2 diabetes to human insulins to save money on analogues, according to a review of 14,635 members of CareMore, a Medicare Advantage company based in Cerritos, Calif.
The company noticed that its spending on analogue insulins had ballooned to over $3 million a month by the end of 2014, in the wake of analogue price increases of more than 300% in recent years, while copays on analogues rose from nothing to $37.50. In 2015, it launched a program to switch patients with type 2 diabetes to less costly human insulins. Physicians were counseled to stop secretagogues and move patients to premixed insulins at 80% of their former total daily analogue dose, two-thirds at breakfast and one-third at dinner, with appropriate follow-up.
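The dose-conversion rule described above reduces to simple arithmetic, sketched below. This is an illustration of the protocol as reported, not dosing guidance; actual conversions require clinical judgment and follow-up.

```python
# Hedged sketch of the reported conversion rule: premixed human insulin at
# 80% of the prior total daily analogue dose, split two-thirds at breakfast
# and one-third at dinner. Illustrative only, not clinical advice.

def convert_to_premixed(total_daily_analogue_units):
    """Return (breakfast_units, dinner_units) under the 80% rule."""
    new_total = 0.8 * total_daily_analogue_units
    breakfast = round(new_total * 2 / 3)
    dinner = round(new_total * 1 / 3)
    return breakfast, dinner

# A patient on 60 units/day of analogue insulin would start at 48 units/day
# of premixed human insulin, split 32 at breakfast and 16 at dinner:
print(convert_to_premixed(60))  # → (32, 16)
```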
Analogue insulins fell from 90% of all insulins dispensed to 30%, with a corresponding rise in human insulin prescriptions. Total plan spending on analogues fell to about a half million dollars a month by the end of 2016. Spending on human insulins rose to just under a million dollars. The risk of patients falling into the Medicare Part D coverage gap – where they assume a greater proportion of their drug costs – was reduced by 55% (P less than .001).
“A lot of money was saved as a result of this intervention,” said lead investigator Jin Luo, MD, an internist and health services researcher at Brigham and Women’s Hospital, Boston.
Mean hemoglobin A1c rose 0.14% from a baseline of 8.46% in 2014 (P less than .01), “but we do not believe that this is clinically important because this value falls within the biological within-subject variation of most modern HbA1c assays,” he said at the annual scientific sessions of the American Diabetes Association.
Meanwhile, there was no statistically significant change in the rate of hospitalizations or emergency department visits for hypoglycemia or hyperglycemia.
“Patients with type 2 diabetes and their clinical providers should strongly consider human insulin as a clinically viable and cost effective option,” Dr. Luo said.
“My personal clinical opinion is that if I have a patient who is really hard to control, and after four or five different regimens, we finally settle on an analogue regimen that [keeps] them under control” and out of the hospital, “I’m not going to switch them just because a health plan tells me I should. They are just too brittle, and I’m not comfortable doing that. Whereas if I have a patient who’d be fine with either option, and I’m not really worried about hypoglycemia, I’ll switch them,” he said.
There was no industry funding. Dr. Luo is a consultant for Alosa Health and Health Action International.
SOURCE: Luo J et al. ADA 2018, Abstract 4-OR.
REPORTING FROM ADA 2018
Key clinical point: Switching patients with type 2 diabetes from analogue to human insulin can cut costs without a clinically important loss of glycemic control.
Major finding: Mean HbA1c rose just 0.14% from a baseline of 8.46% (P less than 0.01).
Study details: A review of 14,635 Medicare Advantage members with type 2 diabetes.
Disclosures: There was no industry funding. The lead investigator is a consultant for Alosa Health and Health Action International.
Source: Luo J et al. ADA 2018, Abstract 4-OR
T1D neuropathy declines as glycemic control improves
ORLANDO – Rates of diabetic peripheral neuropathy (DPN) in U.S. patients with type 1 diabetes (T1D) may have dipped, possibly because of improving clinical care, a new study suggests. Researchers also found evidence that nonglycemic factors may play important roles in the development of the condition.
There are differences between DPN in T1D and type 2 diabetes: Lifetime incidence in T1D is believed to be 45%, lower than in T2D. However, a 2016 report noted that, “whereas treating hyperglycemia in type 1 DM can significantly reduce the incidence of neuropathy by up to 60 to 70%, glucose control in type 2 DM has only a marginal 5 to 7% reduction in the development of neuropathy.” (F1000Research 2016, 5(F1000 Faculty Rev):738)
Still, DPN is believed to be very common in T1D. According to the new study, previous research has suggested that the DPN rate in this population could be as high as 35%.
For the new study, researchers examined self-reports of DPN from 5,058 patients across 62 sites via the T1D Exchange Registry. All patients were at least 18 years of age and had at least 5 years of T1D. Their mean age was 39 years, the mean duration of diabetes was 22 years, and their average hemoglobin A1c was 8.1%. Over half (56%) were women, and most (88%) were white.
A preliminary analysis found that just 10% of the patients had signs of DPN, according to their self-reports. In part, the difference between this number and previous estimates of DPN prevalence may be because previous studies relied on symptoms, exams, and electrophysiologic testing, said study researcher Kara Mizokami-Stout, MD, of the University of Michigan, in an interview.
However, study researcher Rodica Pop-Busui, MD, PhD, noted in an interview that one strength of the new study is that it’s “a broad sample of patients with type 1 diabetes as they are currently treated in clinical care across the United States.”
Compared with those without DPN, those with the condition were more likely to be older (mean age, 52 vs. 37 years), female (61% vs. 55%), and to have had T1D for a longer period (mean, 32 vs. 21 years). They were also poorer and had less education (all P less than .001).
The DPN group also had slightly higher systolic blood pressure (mean 126 vs. 123), higher triglycerides (117 vs. 95) and more than double the rate of tobacco use (9% vs. 4%), all P less than .001.
Also, cardiovascular disease was more common (26% vs. 6%) even though this group used statins (64% vs. 31%) and ACE inhibitors/ARBs (45% vs. 23%) at much higher levels, all P less than .001.
Researchers also found that those with DPN had higher HbA1c even after controlling for various confounders (8.4% vs. 8.1%, P less than .01).
“We have the ability to prevent neuropathy, and we should do that to our advantage, targeting glycemic control as best as possible without increasing the risk of hypoglycemia,” Dr. Mizokami-Stout said. Targeting nonglycemic factors is also crucial, she said.
The study was funded by the Helmsley Charitable Trust. Dr. Mizokami-Stout and Dr. Pop-Busui report no relevant disclosures. Some of the other authors report various disclosures.
SOURCE: Mizokami-Stout K, et al. ADA 2018, Abstract 62-OR.
REPORTING FROM ADA 2018
Key clinical point: Diabetic peripheral neuropathy (DPN) may be on the decline in type 1 diabetes (T1D), and nonglycemic factors may be crucial.
Major finding: 10% of subjects showed signs of DPN via self-report, and those with DPN had much higher rates of cardiovascular disease.
Study details: Analysis of 5,058 patients across 62 sites via the T1D Exchange Registry.
Disclosures: The study was funded by the Helmsley Charitable Trust. Some of the authors report various disclosures.
Source: Mizokami-Stout K, et al. ADA 2018, Abstract 62-OR.
New SLE classification criteria reset disease definition
AMSTERDAM – The new systemic lupus erythematosus classification criteria of the American College of Rheumatology and the European League Against Rheumatism are based on a point system that will produce a “paradigm shift” in how the disease gets studied going forward, said Sindhu Johnson, MD, while presenting the latest version of the newly revised classification scheme at the European Congress of Rheumatology.
Until now, classification of systemic lupus erythematosus (SLE) was a yes-or-no decision, based on whether the patient had a minimum number of characteristic signs or symptoms. The new criteria, which are on track for formal endorsement before the end of 2018 by the two medical societies that sponsored the revision, instead use a point system that gives varying weight to each of the 22 criteria. A patient needs to score at least 10 points from these criteria, and all patients classified with SLE also must have an antinuclear antibody (ANA) titer of at least 1:80 on HEp-2 cells or an equivalent positive test. This means that the criteria also can define patients who just miss classification with SLE by meeting the ANA standard and by tallying 8 or 9 points, and the criteria also identify patients who far exceed the classification threshold by having the requisite ANA plus racking up as many as, perhaps, 20 or 30 points.
“This is a real research opportunity,” to follow patients who fall just short with 8 or 9 points to assess their longer-term prognosis, as well as to study whether “higher scores mean a higher risk for developing a bad outcome,” said Dr. Johnson, a rheumatologist at the University of Toronto and director of the Toronto Scleroderma Program. Other areas for future research with the new criteria include seeing how they work in various SLE subgroups, such as patients with renal-predominant disease or skin-predominant disease, and also seeing how they work in various ethnic populations.
“Diagnosis of lupus still falls within the realm of the treating physician,” but the classification criteria “inform our concept of the disease,” Dr. Johnson said in a video interview. “The new criteria allow for a shift in the way we think of the disease.”
For example, for the first time, the new criteria include fever as a classification criterion, which receives 2 points if an infectious or other non-SLE cause can be discounted. Fever has recently been identified as a marker of early-stage SLE in at least some patients, and its addition to the classification criteria “adds a new dimension to how we think about the disease and allows us to distinguish early disease from mimicking diseases,” she explained. At the other end of the classification spectrum, a finding of class III or IV lupus nephritis on renal biopsy receives 10 points, and hence, this one finding plus having a high enough level of ANA leads to SLE classification regardless of whether the patient has any other signs or symptoms of the disease.
That’s because “85% of our experts said that they would feel confident classifying a patient as having lupus based only on a renal biopsy” and ANA positivity, said Dr. Johnson, who served as the ACR-appointed cochair of the criteria-writing panel along with a cochair selected by EULAR, Martin Aringer, MD, PhD, of the Technical University of Dresden (Germany). She cautioned that other levels of lupus nephritis, class II or V, confer only 8 points to the classification and so by themselves are not enough to label a person as having lupus.
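The classification logic described above reduces to an entry criterion plus a weighted sum. Here is a minimal sketch; the criterion names and dictionary layout are illustrative, and only the point values quoted in this article are included (the full table assigns weights from 2 to 10 points across all 22 criteria):

```python
# Sketch of the point-based EULAR/ACR SLE classification rule.
# Only the weights quoted in this article are included; names are illustrative.
CRITERION_POINTS = {
    "fever": 2,                    # counts only if non-SLE causes are excluded
    "nephritis_class_III_IV": 10,  # class III or IV lupus nephritis on biopsy
    "nephritis_class_II_V": 8,     # class II or V lupus nephritis on biopsy
}

ANA_ENTRY_TITER = 80  # entry criterion: ANA of at least 1:80 on HEp-2 cells
THRESHOLD = 10        # at least 10 points classifies a patient as having SLE

def classify_sle(ana_titer_denominator, criteria):
    """Return (classified_as_sle, total_points) for a patient."""
    if ana_titer_denominator < ANA_ENTRY_TITER:
        return False, 0  # without the ANA entry criterion, points never apply
    total = sum(CRITERION_POINTS[c] for c in criteria)
    return total >= THRESHOLD, total

# Class III/IV nephritis alone reaches the threshold with positive ANA:
print(classify_sle(80, ["nephritis_class_III_IV"]))  # (True, 10)
# Class II/V nephritis alone falls 2 points short:
print(classify_sle(80, ["nephritis_class_II_V"]))    # (False, 8)
```

This mirrors the two cases Dr. Johnson describes: biopsy-proven class III/IV nephritis plus ANA positivity is sufficient on its own, while class II/V nephritis is not.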
During her presentation, Dr. Johnson cited the high levels of sensitivity and specificity that the new classification criteria demonstrated in a validation cohort of more than 1,000 cases and controls. In the validation analysis, the new criteria had a sensitivity of 96.12% and specificity of 94.43% for classifying SLE, giving the new criteria a better result on both these measures than either the 1997 ACR criteria (Arthritis Rheum. 1997 Sept;40[9]:1725) or the 2012 Systemic Lupus International Collaborating Clinics criteria (Arthritis Rheum. 2012 Aug;64[8]:2677-86).
The 22 criteria cluster into seven separate clinical domains and three different immunologic domains. The point values assigned to each criterion range from 2 to 10 points.
Dr. Johnson had no disclosures.
REPORTING FROM THE EULAR 2018 CONGRESS
SLE classification criteria perform well in validation study
AMSTERDAM – The first European League Against Rheumatism and American College of Rheumatology joint criteria for classifying systemic lupus erythematosus have a sensitivity and a specificity of more than 90%.
This is important because they improve upon the existing ACR and Systemic Lupus International Collaborating Clinics (SLICC) criteria, said Martin Aringer, MD, PhD, who cochaired the Steering Committee that produced the new classification criteria.
Most clinicians working with lupus are familiar with the 1997 ACR criteria for the classification of systemic lupus erythematosus (SLE), which “had a relatively simple structure,” Dr. Aringer said during the opening plenary abstract session at the European Congress of Rheumatology. These criteria considered items such as the presence of malar or discoid rash, photosensitivity, oral ulcers, and arthritis, among others, and had high specificity but lower sensitivity. The development of the SLICC criteria in 2012 improved upon the sensitivity of the ACR criteria (92%-99% vs. 77%-91%), but at a loss in specificity (74%-88% vs. 91%-96%).
The SLICC criteria introduced two novel ideas, said Dr. Aringer, professor of medicine and chief of the division of rheumatology at the Technical University of Dresden (Germany). The first was that there had to be at least one immunologic criterion met, and the second was that biopsy-proven lupus nephritis had to be present with antinuclear antibodies (ANA) and anti-DNA antibodies detected.
One of the goals in developing the joint EULAR/ACR criteria therefore was to maintain the respective sensitivity and specificity achieved with the SLICC and ACR criteria. A key question was whether ANA could be used as an entry criterion. Investigations involving more than 13,000 patients with SLE showed that it could: an antibody titer threshold of 1:80 exhibited a sensitivity of 98% (Arthritis Care Res. 2018;70[3]:428-38). Another goal was to see whether histology-proven nephritis was a stronger predictor of SLE than clinical factors, such as oral ulcers, and to identify items that would be counted only if there was no other more likely explanation (Lupus. 2016;25[8]:805-11).
Draft SLE classification criteria were developed based on an expert Delphi process and included ANA as an entry criterion and weighted items according to the likelihood of being associated with lupus. Items considered included the presence and severity of lupus nephritis, serology and other antibody tests, skin and central nervous system involvement, and hematologic and immunologic criteria such as the presence of thrombocytopenia and low complement (C3 and/or C4).
The final, simplified draft SLE classification criteria include 22 items in addition to the presence of ANA. A cut-off score of 10 or more is required for a classification of SLE. For example, a patient with an ANA of 1:80 or higher plus class III/IV nephritis (scoring 10) would be classified as having SLE. A patient with class II/V nephritis (scoring 8) would need another factor to be classified as having lupus, such as the presence of arthritis (scoring 6).
“Performance characteristics find sensitivity similar to the SLICC criteria while maintaining the specificity of the ACR 1997 criteria,” Dr. Aringer said, adding that these criteria will now be formally submitted to and reviewed by EULAR and ACR.
The sensitivity and specificity of the new criteria were 98% and 96% in the derivation cohort and 96% and 93% in the validation cohort.
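As a back-of-envelope check, those validation figures translate into approximate patient counts when applied to the cohort sizes reported for the validation study (1,160 SLE cases and 1,058 non-SLE controls). A minimal sketch, using the rounded 96%/93% figures:

```python
# Rough patient counts implied by the validation-cohort performance:
# ~96% sensitivity and ~93% specificity applied to 1,160 SLE cases
# and 1,058 non-SLE controls.
cases, controls = 1160, 1058
sensitivity, specificity = 0.96, 0.93

true_positives = round(cases * sensitivity)     # cases correctly classified
false_negatives = cases - true_positives        # cases the criteria miss
true_negatives = round(controls * specificity)  # controls correctly excluded
false_positives = controls - true_negatives     # controls wrongly classified

print(true_positives, false_negatives)  # 1114 46
print(true_negatives, false_positives)  # 984 74
```

In other words, at these rates the criteria would miss roughly 46 of the 1,160 true SLE cases and misclassify roughly 74 of the 1,058 controls.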
“I was really very pleased and very happy to see that the revised or the new ACR/EULAR classification criteria had sensitivity and specificity of above 90%,” Thomas Dörner, MD, PhD, said in an interview at the congress. Dr. Dörner was a codeveloper of these criteria.
Over the past 10-15 years there have been several therapies that have failed to live up to their early promise as potential treatments for lupus, said Dr. Dörner, professor of medicine at Charité–Universitätsmedizin Berlin. He noted that the failed treatment trials had led investigators to try to determine ways in which lupus might be best treated, such as by a “treat-to-target” approach to attain remission and low disease activity. It also led to the reevaluation of how lupus is classified to see if that might be affecting the population of patients recruited into clinical trials.
“We had the feeling, and this is now confirmed by the new classification criteria, that a number of patients studied in earlier trials may have not fulfilled what we think is the classical lupus profile, so-called lupus or SLE mimickers,” Dr. Dörner said. This could have affected the chances of a treatment approach being successful versus placebo.
The new classification criteria are similar to those in other rheumatic diseases in that they give different weight to the effects on different organ systems, Dr. Dörner said. The stipulation that there must be a positive ANA test is also an important step, “really to make sure that we are looking at an autoimmune disease and nothing else,” he observed.
Patients who do not have a positive ANA test can of course still be treated, Dr. Dörner noted, but for classification and for entering patients into clinical trials, it is important to have strict criteria so that results can be compared across studies.
Dr. Aringer and Dr. Dörner had no relevant disclosures besides their involvement in developing the new classification criteria.
SOURCE: Aringer M et al. Ann Rheum Dis. 2018;77(Suppl 2):60. Abstract OP0020.
REPORTING FROM THE EULAR 2018 CONGRESS
Key clinical point: New classification criteria for systemic lupus erythematosus (SLE) achieve both high sensitivity and specificity.
Major finding: The sensitivity and specificity of the new criteria were 98% and 96% in the derivation cohort and 96% and 93% in the validation cohort.
Study details: An international cohort of 1,160 SLE patients and 1,058 non-SLE patients in whom the new criteria were tested and validated.
Disclosures: Dr. Aringer and Dr. Dörner had no relevant disclosures besides their involvement in developing the new classification criteria.
Source: Aringer M et al. Ann Rheum Dis. 2018;77(Suppl 2):60. Abstract OP0020.
Canakinumab cut gout attacks in CANTOS
AMSTERDAM – Treatment with the interleukin-1 beta blocker canakinumab was linked with a roughly 50% reduction in gout flares in an exploratory, post hoc analysis of data collected from more than 10,000 patients in the CANTOS multicenter, randomized trial.
While this result is only a hypothesis-generating suggestion that blocking interleukin (IL)-1 beta can have a significant impact on the frequency of gout flares, it serves as a proof-of-concept that IL-1 beta blockade is a potentially clinically meaningful strategy for future efforts to block gout attacks, Daniel H. Solomon, MD, said at the European Congress of Rheumatology.
“IL-1 beta is incredibly important in the inflammation associated with gout. Gout is considered by many to be the canonical IL-1 beta disease,” and hence it was important to examine the impact that treatment with the IL-1 beta blocker canakinumab had on gout in the CANTOS trial, Dr. Solomon explained in a video interview.
The answer was that treatment with canakinumab was linked with a roughly 50% reduction in gout flares in the total study group. The same reduction was seen in both the subgroups of patients with and without a history of gout. The effect was seen across all three subgroups of patients, based on their baseline serum urate levels including those with normal, elevated, or very elevated levels and across all the other prespecified subgroups including divisions based on sex, age, baseline body mass index, and baseline level of high-sensitivity C-reactive protein (hsCRP).
It’s also unclear that canakinumab (Ilaris) is the best type of IL-1 beta blocking drug to use for prevention of gout flares. In CANTOS, this expensive drug was administered subcutaneously every 3 months. A more appropriate agent might be an oral, small-molecule drug that blocks IL-1 beta. Several examples of this type of agent are currently in clinical development, said Dr. Solomon, a professor of medicine at Harvard Medical School and a rheumatologist at Brigham and Women’s Hospital, both in Boston.
CANTOS (Canakinumab Anti-inflammatory Thrombosis Outcome Study) randomized 10,061 patients with a history of MI and a hsCRP level of at least 2 mg/L at centers in 39 countries. The study’s primary endpoint was the combined rate of cardiovascular death, MI, or stroke, and canakinumab treatment at the 150-mg dosage level linked with a 15% relative reduction in this endpoint, compared with placebo in this secondary-prevention study (N Engl J Med. 2017 Sept 21;377[12]:1119-31). The study also randomized patients to either of two other canakinumab dosages, 50 mg or 300 mg, administered every 3 months, and, while each of these produced reductions in the primary endpoint relative to placebo, the 150-mg dosage had the largest effect. In the gout analysis reported by Dr. Solomon, the three different canakinumab dosages produced somewhat different levels of gout-flare reductions, but, generally, the effect was similar across the three treatment groups.
In the total study population, regardless of gout history, treatment with 50 mg, 150 mg, and 300 mg canakinumab every 3 months was linked with a reduction in gout attacks of 46%, 57%, and 53%, respectively, compared with placebo-treated patients, Dr. Solomon reported. The three dosages also uniformly produced significant drops in serum levels of hsCRP, compared with placebo, but canakinumab treatment had no impact on serum urate levels, indicating that the gout-reducing effects of the drug did not occur via a mechanism that involved serum urate.
Because CANTOS exclusively enrolled patients with established coronary disease, the new analysis could not address whether IL-1 beta blockade would also be an effective strategy for reducing gout flares in people without cardiovascular disease, Dr. Solomon cautioned, although he said it probably would be. He also stressed that treatment with an IL-1 blocking drug should not be seen as a substitute for appropriate urate-lowering treatment in patients with elevated levels of serum urate.
SOURCE: Solomon DH et al. Ann Rheum Dis. 2018;77(Suppl 2):56. Abstract OP0014.
REPORTING FROM THE EULAR 2018 CONGRESS
Key clinical point: IL-1 blockade seems to be an effective way to cut the incidence of gout attacks.
Major finding: IL-1 blockade with canakinumab was linked with about a 50% cut in gout flares, compared with placebo.
Study details: CANTOS, a multicenter, randomized trial with 10,061 patients.
Disclosures: CANTOS was funded by Novartis, the company that markets canakinumab. Dr. Solomon has no relationships with Novartis. Brigham and Women’s Hospital, the center at which he works, has received research funding from Amgen, Bristol-Myers Squibb, Genentech, and Pfizer for studies that Dr. Solomon has helped direct.
Source: Solomon DH et al. Ann Rheum Dis. 2018;77(Suppl 2):56. Abstract OP0014.
Observational data can’t answer question of inhibiting ankylosing spondylitis progression
AMSTERDAM – The attempt to determine whether biologics such as tumor necrosis factor inhibitors (TNFi) inhibit progression of ankylosing spondylitis has been pursued with observational studies, but these types of studies will never definitively answer the question, according to Robert B.M. Landewé, MD, PhD, professor of rheumatology at the University of Amsterdam.
“The methodology is sensitive to a lot of measurement error, making the results spurious,” Dr. Landewé said in an interview, recapping remarks he made in a presentation at the European Congress of Rheumatology.
This was disappointing to many investigators, including several speaking in the same symposium where Dr. Landewé made his remarks. Randomized, controlled trials that employ serial radiographs to document changes in ankylosing spondylitis are expensive, making observational studies an attractive surrogate, but Dr. Landewé said such studies are associated with an inherent risk of residual confounding.
In addition, he believes the effect size of biologics on progression, if it exists at all, is likely to be subtle. In the observational studies that have concluded that there is protection, complicated statistical analyses have typically been employed to produce a significant finding.
Observational studies do have hypothesis-generating value, according to Dr. Landewé, but he cautioned that they produce “more questions than answers.” He also emphasized that the inflammation-related progression that leads to bone growth in ankylosing spondylitis is different than it is in the destructive inflammatory diseases, such as rheumatoid arthritis, where the issue is bone loss.
It is rational to assume that effective anti-inflammatory therapy would prevent progression of inflammatory diseases, but Dr. Landewé said in his presentation that this is the type of bias that undermines the value of observational studies for reaching objective conclusions. Unlike the results of a registered randomized, controlled trial, which will be known to be consistent or not with the underlying hypothesis, there is a strong risk that data in an observational study will be reworked until they produce the desired result.
REPORTING FROM THE EULAR 2018 CONGRESS
Size can matter: Laparoscopic hysterectomy for the very large uterus

Visit the Society of Gynecologic Surgeons online: sgsonline.org
Additional videos from SGS are available here, including these recent offerings:

Ankylosing spondylitis diagnosis linked to self-harm attempts
AMSTERDAM – There is an increased relative risk of deliberate self-harm that results in emergency treatment among individuals newly diagnosed with ankylosing spondylitis, according to the results of a large, Canadian population-based study.
A diagnosis of ankylosing spondylitis was associated with a 59% increased risk of deliberate self-harm, compared with no diagnosis (HR = 1.59; 95% CI, 1.16-2.21). While the risk of deliberate self-harm in patients diagnosed with rheumatoid arthritis (RA) was initially elevated, the association was not significant after adjustment for confounding factors (HR = 1.08; 95% CI, 0.87-1.34).
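For readers less familiar with hazard ratios, significance here is read off the confidence interval: the association is significant (at the interval's confidence level) when the interval excludes the null value of 1. A minimal sketch of that logic, not the study's actual Cox model:

```python
def ci_excludes_null(lower, upper, null_value=1.0):
    """True when a confidence interval excludes the null hazard ratio of 1,
    i.e., the association is statistically significant at the CI's level."""
    return not (lower <= null_value <= upper)

# Intervals as reported in the study:
print(ci_excludes_null(1.16, 2.21))  # ankylosing spondylitis: True (significant)
print(ci_excludes_null(0.87, 1.34))  # rheumatoid arthritis: False (not significant)
```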
These findings call for heightened awareness among clinicians, study investigator Nigil Haroon, MD, PhD, said in an interview at the European Congress of Rheumatology. “Depression is generally well known to be increased in patients with chronic diseases, especially so with chronic inflammatory rheumatic diseases like ankylosing spondylitis and rheumatoid arthritis,” he said. This may in turn be linked to increased cases of deliberate self-harm, but there have been few studies to determine if this is the case, he said, which may be because it is a relatively rare event in routine clinical practice.
Dr. Haroon, who runs a specialist clinic in ankylosing spondylitis in Toronto, has seen firsthand the long-term effects of chronic pain, lack of social support, and inability to sleep on patients’ mood. This is what drove him and other colleagues at the University of Toronto and University Health Network to look at the possibility that this could be linked to an increased risk for depression and perhaps deliberate self-harm among newly diagnosed patients.
To try to estimate the risk, they obtained administrative data on more than 100,000 individuals diagnosed with ankylosing spondylitis or RA in the province of Ontario, Canada, between 2002 and 2014. Excluding those with a history of mental illness or a prior self-harm attempt resulted in the creation of two cohorts of patients – 13,964 with ankylosing spondylitis and 53,240 with RA. Individuals in these two cohorts were then matched, 4:1, to similar controls in the general population.
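The 4:1 matching step can be sketched as follows; the matching variables (age and sex) and the greedy algorithm here are illustrative assumptions, not the study's actual procedure:

```python
from collections import defaultdict

def match_controls(cases, control_pool, ratio=4):
    """Match each case to `ratio` unused general-population controls sharing
    the same age and sex. A naive greedy sketch of 4:1 matching -- not the
    actual matching algorithm used in the Ontario study."""
    by_stratum = defaultdict(list)
    for person in control_pool:
        by_stratum[(person["age"], person["sex"])].append(person)
    matched = {}
    for case in cases:
        stratum = by_stratum[(case["age"], case["sex"])]
        matched[case["id"]] = stratum[:ratio]  # take the next 4 unused controls
        del stratum[:ratio]                    # controls are used at most once
    return matched

# Toy data: one newly diagnosed patient and six candidate controls.
cases = [{"id": "AS-1", "age": 46, "sex": "M"}]
pool = [{"id": f"C{i}", "age": 46, "sex": "M"} for i in range(6)]
print(len(match_controls(cases, pool)["AS-1"]))  # → 4
```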
The average age of those diagnosed with ankylosing spondylitis was 46 years and of those with RA was 57 years, with more males than females in the ankylosing spondylitis group (57% vs. 43%) and more females than males in the RA group (67% vs. 33%).
The main outcome assessed was the first episode of intentional self-injury or self-poisoning that required emergency treatment that occurred after the diagnosis of ankylosing spondylitis or RA.
Overall, there were 69 deliberate self-harm attempts recorded in the ankylosing spondylitis patient group, compared with 131 attempts in the non-ankylosing spondylitis group. In the RA patient group, there were 129 attempts, and 372 attempts in the non-RA group.
Poisoning was “by far the most common modality” of intentional self-harm, used by 67% of patients with ankylosing spondylitis and by 81% of those with RA, Dr. Haroon reported. Contact with a sharp object was the second most common method, used by 30% of ankylosing spondylitis patients and 16% of RA patients.
Most (70%) patients were discharged following emergency treatment for a deliberate self-harm attempt, with around 15% of ankylosing spondylitis and 22% of RA patients requiring hospital admission.
“For any chronic disease there is a potential for depression to settle, and we should identify [patients] early, even at the primary care levels itself and try to address it,” Dr. Haroon advised. It’s important to spend time and to develop a good rapport with your patients, he added, which can help them open up and talk about their mood.
The work was funded by the Division of Rheumatology Pfizer Research Chair, University of Toronto. Dr. Haroon reported having no relevant financial disclosures.
The video associated with this article is no longer available on this site. Please view all of our videos on the MDedge YouTube channel
SOURCE: Kuriya B et al. Ann Rheum Dis. 2018;77(Suppl 2):195. Abstract OP0296.
REPORTING FROM THE EULAR 2018 CONGRESS
Key clinical point: A new diagnosis of ankylosing spondylitis is associated with an increased risk of deliberate self-harm.
Major finding: Newly diagnosed individuals with ankylosing spondylitis are more likely to attempt self-harm than those without the diagnosis (HR = 1.59; 95% CI, 1.16-2.21).
Study details: Population-based study of 13,964 individuals with ankylosing spondylitis, 53,240 individuals with RA, and matched controls from the general population.
Disclosures: The work was funded by the Division of Rheumatology Pfizer Research Chair, University of Toronto. Dr. Haroon reported having no relevant financial disclosures.
Source: Kuriya B et al. Ann Rheum Dis. 2018;77(Suppl 2):195. Abstract OP0296.