Tanezumab improves osteoarthritis pain, function in phase 3 trial
MADRID – Tanezumab, an investigational monoclonal antibody directed against nerve growth factor that is under development to treat osteoarthritis pain, met most of the coprimary efficacy endpoints set for the drug in a randomized, double-blind, parallel-group, placebo-controlled phase 3 study.
At the end of a 24-week, double-blind treatment period, Western Ontario and McMaster Universities Osteoarthritis Index (WOMAC) pain and WOMAC physical function subscale scores were significantly improved, compared with placebo in the two tanezumab (2.5 mg and 5 mg) dose groups.
The least squares (LS) mean changes from baseline in WOMAC pain scores were –2.24 for placebo, –2.70 for tanezumab 2.5 mg, and –2.85 for tanezumab 5 mg (P ≤ .01 and P ≤ .001 vs. placebo, respectively).
The LS mean changes from baseline in WOMAC physical function scores were –2.11, –2.70, and –2.82, respectively (P ≤ .001 for both doses vs. placebo).
The coprimary endpoint of patients’ global assessment of OA (PGA-OA) was also significantly improved with tanezumab 5 mg (–0.90; P ≤ .05) but not with 2.5 mg (–0.82) versus placebo (–0.72).
Because the 2.5-mg dose of tanezumab did not meet one of the three coprimary endpoints, further hypothesis testing was not possible, but exploratory findings suggested that both tanezumab doses yielded higher proportions of patients with reductions from baseline in WOMAC pain scores, compared with placebo. This was the case for reductions of at least 30% (65.6% with 2.5 mg, 68.7% with 5 mg, and 56.6% with placebo), at least 50% (45.4%, 47.9%, and 33.8%, respectively), and at least 70% (21.3%, 23.2%, and 17.8%).
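To illustrate what those responder thresholds mean operationally, the short sketch below computes 30%/50%/70% responder rates from simulated baseline and week-24 WOMAC pain scores. All data here are synthetic and purely illustrative; nothing is drawn from the trial dataset.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic example: baseline and week-24 WOMAC pain scores (0-10 scale)
# for 280 hypothetical patients; these numbers are illustrative only.
baseline = rng.uniform(5.0, 9.0, size=280)
week24 = (baseline - rng.normal(2.5, 1.5, size=280)).clip(0.0, 10.0)

# Percentage reduction from baseline for each patient.
pct_reduction = (baseline - week24) / baseline * 100

# Responder rate at each threshold reported in the trial (30%, 50%, 70%).
for threshold in (30, 50, 70):
    rate = np.mean(pct_reduction >= threshold) * 100
    print(f">= {threshold}% reduction: {rate:.1f}% of patients")
```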
“I think that we have now a lot of studies with tanezumab showing a significant effect on hip and knee OA pain and function, so we have the studies in order to have the drug on the market,” study first author Francis Berenbaum, MD, PhD, of Saint-Antoine Hospital, Sorbonne Université in Paris, said in an interview at the European Congress of Rheumatology.
“Of course, because of the safety issue with rapid progressive osteoarthritis (RPOA), what we are discussing now is: ‘For which patients will there be an optimal benefit-to-risk?’ So, it’s now more a discussion around the population of patients who can benefit the most with the drug,” Dr. Berenbaum added.
A possible link between the use of tanezumab and a risk for developing RPOA was first suggested by preclinical and early clinical trial data, prompting the U.S. Food and Drug Administration to place partial holds on its clinical development in 2010, and again in 2012.
However, Dr. Berenbaum noted that a “mitigation plan” had been put in place for the phase 3 program to try to lower the likelihood of RPOA. This included lowering the dose of the drug and delivering it subcutaneously rather than intravenously; not prescribing it with NSAIDs; and testing its effects and safety in a difficult-to-treat population of patients with no known risk factors for this potentially very serious adverse event.
“Based on this mitigation plan, the risk of rapid progressive osteoarthritis has considerably decreased,” Dr. Berenbaum observed. Indeed, in the phase 3 study he presented at the meeting, around 2% of patients developed RPOA, which he said is “exactly in line with what has already been shown.” RPOA was reported in none of the placebo-treated patients, in 1.4% of those treated with tanezumab 2.5 mg, and in 2.8% of those treated with tanezumab 5 mg.
However, a “striking” finding of the current study was that, despite the small increase in RPOA seen, there was no difference between the tanezumab and placebo groups in the number of patients needing total joint replacement (TJR). The percentages of patients undergoing at least one TJR were 6.7% in the placebo group, 7.8% in the tanezumab 2.5-mg group, and 7.0% in the tanezumab 5-mg group.
The joint safety events seen in the study, including TJRs, were adjudicated as being part of the normal progression of OA in the majority (73.4%) of cases. Other joint events of note were one case of subchondral insufficiency fracture occurring in a patient treated with tanezumab 2.5 mg and one case of primary osteonecrosis in a patient treated with tanezumab 5 mg.
During his presentation of the findings in a late-breaking oral abstract session, Dr. Berenbaum noted that this was a difficult-to-treat population. All 849 recruited patients had moderate to severe OA pain of the knee or hip; had a history of insufficient pain relief from, or intolerance to, acetaminophen, oral NSAIDs, and tramadol; and were not responding to, or were unwilling to take, opioid painkillers. Patients also had to have no radiographic evidence of specified bone conditions, including RPOA.
Patients had been treated with subcutaneous tanezumab 2.5 mg (n = 283) or 5 mg (n = 284) or placebo (n = 282) at baseline, week 8, and week 16, with the three coprimary efficacy endpoints assessed at week 24.
Discussing the risk-to-benefit ratio of the drug after his presentation, Dr. Berenbaum said: “You have to keep in mind that, first, it was in very difficult-to-treat patients, compared to the other trials in the field of OA symptoms.”
He added: “Second, is that compared to the other trials, this one was able to include patients with Kellgren-Lawrence grade 4, meaning that this is a more serious population,” and third, “when you look at the responders – WOMAC 30%, 50%, 70% – there is a strong difference in terms of responders.”
Dr. Berenbaum and his coauthors noted on the poster that accompanied the late-breaking oral presentation that “an active-controlled study will provide data to further characterize the risk-benefit of tanezumab in patients with OA.”
The study was sponsored by Pfizer and Eli Lilly. Dr. Berenbaum disclosed receiving research funding through his institution from Pfizer and acting as a consultant to, and speaker for, the company as well as multiple other pharmaceutical companies. Coauthors of the study also disclosed research funding or consultancy agreements with Pfizer or Eli Lilly or were employees of the companies.
SOURCE: Berenbaum F et al. Ann Rheum Dis. Jun 2019;78(Suppl 2):262-4. Abstract LB0007, doi: 10.1136/annrheumdis-2019-eular.8660
REPORTING FROM EULAR 2019 CONGRESS
IHS Announces Requirements to Increase Access to OUD Treatment
Native American communities have experienced the largest increase in drug overdose deaths of all racial/ethnic groups in the US. Between 1999 and 2015, drug overdose deaths rose > 500%. To help ensure that American Indians and Alaska Natives (AI/AN) get the treatment they need, the Indian Health Service (IHS) has released Special General Memorandum 2019-01: Assuring Access to Medication Assisted Treatment for Opioid Use Disorder. It requires all IHS federal facilities to:
- Identify opioid use disorder (OUD) treatment resources in their local areas;
- Create an action plan, no later than Dec. 11, 2019; and
- Provide or coordinate patient access to medication-assisted treatment (MAT), specifically increasing access to culturally appropriate prevention, treatment, and recovery support services.
MAT is a comprehensive evidence-based approach that combines pharmacologic interventions with substance abuse counseling and culturally sensitive social support.
The IHS has recently taken other steps to further facilitate MAT access in tribal communities. For example, it has added 3 FDA-approved medications to the National Core Formulary: buprenorphine, buprenorphine/naloxone, and injectable naltrexone, all of which relieve withdrawal symptoms and psychological cravings, supporting adherence to treatment and reducing illicit opioid use.
In addition, the IHS has published the Internet Eligible Controlled Substance Provider Designation Policy. This policy, established in 2018, is designed to increase access to treatment for AI/AN who live in rural or remote areas, where it can be difficult to access a provider with the necessary training and Drug Enforcement Administration approval to prescribe buprenorphine in an outpatient or office-based setting. Once approved, IHS, tribal, and urban Indian organization health care providers can prescribe controlled substances for MAT through telemedicine.
In 2018, the IHS also launched a new website (www.IHS.gov/opioids) to share information about opioids with patients, health care providers, tribal leaders, tribal and urban program administrators, and other community members. The site includes information on approaches to prevent opioid abuse, pain management, recovery tools, and funding opportunities.
Algorithm predicts villous atrophy in children with potential celiac disease
A new algorithm may be able to predict which children with potential celiac disease will go on to develop villous atrophy, according to investigators writing in Gastroenterology.
The risk model was developed from the largest cohort of its kind, with the longest follow-up to date, reported lead author Renata Auricchio, MD, PhD, of University Federico II in Naples, Italy, and colleagues. Using the algorithm, which relies most heavily on the baseline number of intraepithelial lymphocytes (IELs) in the mucosa, followed by age at diagnosis and genetic profile, clinicians may now consider prescribing gluten-free diets only to the highest-risk patients instead of to all suspected cases; the investigators noted that more than half of potential cases do not develop flat mucosa within 12 years.
Development of the algorithm began with enrollment of 340 children aged 2-18 years who were positive for antiendomysial immunoglobulin A antibodies and had tested positive twice consecutively for antitissue transglutaminase antibodies. Additionally, children were required to possess HLA DQ2- or DQ8-positive haplotypes and to have normal duodenal architecture in five biopsy samples. Because of symptoms suggestive of celiac disease or parental discretion, 60 patients were started on a gluten-free diet and excluded from the study, leaving 280 patients in the final cohort. These patients were kept on a gluten-containing diet and followed for up to 12 years. Every 6 months, the investigators checked antibodies and clinical status, and every 2 years a small bowel biopsy was performed, if symptoms had not necessitated it earlier.
After a median follow-up of 60 months (range, 18 months to 12 years), 39 patients (13.9%) developed symptoms of celiac disease and were placed on a gluten-free diet, although they declined confirmatory biopsy, disallowing classification of celiac disease. Another 33 patients (11.7%) were lost to follow-up, and 89 (32%) stopped producing antibodies, with none going on to develop villous atrophy. In total, 42 patients (15%) developed flat mucosa during the follow-up period, with an estimated cumulative incidence of 43% at 12 years. The investigators noted that patients most frequently progressed within two time frames: at 24-48 months after enrollment or at 96-120 months.
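The gap between the crude 15% figure and the estimated 43% cumulative incidence at 12 years reflects censoring: many patients were followed for well under 12 years. A minimal sketch of how such an estimate is obtained, using the lifelines Kaplan-Meier implementation on entirely synthetic follow-up data (none of these numbers come from the study), is shown below.

```python
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(1)

# Synthetic follow-up data for 280 hypothetical patients: time to flat
# mucosa (months) and an event indicator; values are illustrative only.
event_time = rng.exponential(160, size=280)   # latent progression time
censor_time = rng.uniform(18, 144, size=280)  # follow-up duration
observed = event_time <= censor_time          # progression seen?
duration = np.minimum(event_time, censor_time)

kmf = KaplanMeierFitter()
kmf.fit(durations=duration, event_observed=observed)

# Cumulative incidence = 1 - survival probability. At 144 months
# (12 years) this exceeds the crude proportion of observed events
# because many patients were censored before 12 years of follow-up.
crude = observed.mean()
cum_inc_12y = 1 - kmf.survival_function_at_times(144).iloc[0]
print(f"crude proportion with event: {crude:.0%}")
print(f"estimated 12-year cumulative incidence: {cum_inc_12y:.0%}")
```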
To develop the algorithm, the investigators performed multivariable analysis with several potential risk factors, including age, sex, genetic profile, mucosal characteristics, and concomitant autoimmune diseases. Of these, a high number of IELs on first biopsy was most strongly correlated with progression to celiac disease. Patients who developed villous atrophy had a mean of 11.9 IELs at first biopsy, compared with 6.44 among those who remained in the potential stage (P = .05). The next strongest predictive factors were age and genetic profile. Just 7% of children younger than 3 years developed flat mucosa, compared with 51% of patients aged 3-10 years and 55% of those older than 10 years (P = .007). HLA status was predictive in the group aged 3-10 years but not significant in the youngest or oldest patients. Therefore, HLA haplotype was included in the final algorithm, but with a smaller contribution than five non-HLA genes: IL12a, SH2B3, RGS1, CCR, and IL2/IL21.
“Combining these risk factors, we set up a model to predict the probability for a patient to evolve from potential celiac disease to villous atrophy,” the investigators wrote. “Overall, the discriminant analysis model allows us to correctly classify, at entry, 80% of the children who will not develop a flat mucosa over follow-up, while approximately 69% of those who will develop flat mucosa are correctly classified by the parameters we analyzed. This system is then more accurate to predict a child who will not develop flat mucosa and then can be monitored on a gluten-containing diet than a child who will become celiac.”
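For readers who want a concrete picture of the general technique, the sketch below fits a linear discriminant classifier on synthetic data with stand-in features for the predictors named above. This is an illustration of discriminant analysis in general, assuming invented data, feature codings, and effect sizes; it is not the authors' model and reproduces none of their coefficients.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
n = 280

# Stand-in baseline predictors (all synthetic): IEL count, age group
# (0 = <3 y, 1 = 3-10 y, 2 = >10 y), and a combined HLA / non-HLA
# genetic risk score.
iel = rng.normal(8, 3, size=n).clip(0)
age_group = rng.integers(0, 3, size=n)
genetic_risk = rng.normal(0, 1, size=n)
X = np.column_stack([iel, age_group, genetic_risk])

# Synthetic outcome: higher IEL count, older age group, and higher
# genetic risk raise the probability of progressing to villous atrophy.
logit = 0.4 * (iel - 8) + 0.6 * age_group + 0.5 * genetic_risk - 1.5
y = rng.random(n) < 1 / (1 + np.exp(-logit))

# Discriminant analysis over baseline predictors, mirroring the paper's
# general approach on invented data.
lda = LinearDiscriminantAnalysis()
print("cross-validated accuracy:", cross_val_score(lda, X, y, cv=5).mean())
```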
The investigators noted that the IEL count may not be a commonly performed diagnostic test; however, they recommended it, even if it necessitates referral. “The [IEL] count turned out to be crucial for the prediction power of the discriminant analysis,” the investigators wrote.
“The long-term risks of potential celiac disease have never been accurately evaluated. Thus, before adopting a wait-and-see strategy on a gluten-containing diet, a final decision should always be shared with the family.”
Still, the investigators concluded that gluten-free diet “should not be prescribed indistinctly to all patients” with potential celiac disease, as it is a “very heterogenic condition and is not necessarily the first step of overt disease.”
The investigators disclosed no funding or conflicts of interest.
SOURCE: Auricchio R et al. Gastroenterology. 2019 Apr 9. doi: 10.1053/j.gastro.2019.04.004.
While the simplification of the diagnostic process for celiac disease (CD), now heavily reliant on CD-specific autoantibodies, has made the life of clinicians easier in many respects, new scenarios also have emerged that are posing new challenges. One of them is that a substantial, growing portion of subjects (who may or may not have symptoms) present with positive CD autoantibodies but a normal duodenal mucosa (“potential celiac patient”). If left on gluten, with time some will develop villous atrophy, but some won’t. What is the clinician supposed to do with them?
The paper by Auricchio et al. addresses this issue in a rigorous, well-structured way by closely and prospectively monitoring a large series of pediatric patients. Their conclusions have very useful implications for the clinician. Taking into consideration several criteria they found valuable after a long observation period – such as the age of the child, HLA status, persistence of elevated CD-specific autoantibodies, and the presence or absence of intraepithelial lymphocytes in the initial biopsy – they concluded that one can correctly identify at the outset four out of five potential celiac patients who will not develop villous atrophy and thus do not need to follow a gluten-free diet.
Ultimately, however, let’s not forget that we are still dealing with percentages of risk to develop full-blown CD, not with definitive certainties. Hence, the decision of starting a gluten-free diet or not (and of how often and in which way to monitor those who remain on gluten) remains a mutually agreed upon plan sealed by two actors: on one side the patient (or the patient’s family); and on the other, an experienced health care provider who has clearly explained the facts. In other words, evidence-based criteria, good old medicine, and a grain of salt!
Stefano Guandalini, MD, is a pediatric gastroenterologist at the University of Chicago Medical Center. He has no conflicts of interest.
FROM GASTROENTEROLOGY
Automated measurements of plasma predict amyloid status
Automated measurements of plasma amyloid-beta 42 and amyloid-beta 40 can predict cerebral amyloid-beta status, according to research published online ahead of print June 24 in JAMA Neurology. Analyzing APOE genotype in addition to these biomarkers increases the accuracy of the prediction. This blood test thus could allow neurologists to identify patients at risk of amyloid-beta positivity who should undergo further assessment, said the authors. It also could be used to enroll amyloid-beta–positive participants in clinical trials.
In vivo PET imaging and analysis of cerebrospinal fluid (CSF) can detect amyloid-beta, but these procedures are expensive, and their availability is limited. Clinicians need readily available methods for detecting amyloid-beta, and research has indicated that blood-based biomarkers correlate with those in CSF. Fully automated immunoassays, such as the Elecsys test developed by Roche Diagnostics, have recently demonstrated high reliability and precision for CSF amyloid-beta. Using the Elecsys assay, Sebastian Palmqvist, MD, PhD, a neurologist at Skåne University Hospital in Malmö, Sweden, and colleagues sought to examine the accuracy of plasma amyloid-beta and tau, together with other blood-based biomarkers, at detecting cerebral amyloid-beta.
Testing the immunoassay in two cohorts
Dr. Palmqvist and colleagues examined participants in the prospective Swedish BioFINDER Study, which enrolled patients between July 6, 2009, and February 11, 2015. This cohort included 513 cognitively unimpaired (CU) participants, 265 participants with mild cognitive impairment (MCI), and 64 participants with Alzheimer’s disease dementia. Investigators collected blood and CSF samples at the same time from all participants. Participants’ amyloid-beta status was ascertained using the Elecsys CSF amyloid-beta 42/amyloid-beta 40 ratio. The researchers defined amyloid-beta positivity with an unbiased cutoff of less than 0.059.
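As a minimal illustration of the positivity rule just described (the 0.059 ratio cutoff comes from the study; the numeric values in the example call are invented):

```python
def is_amyloid_beta_positive(csf_abeta42: float, csf_abeta40: float) -> bool:
    """Apply the study's CSF amyloid-beta 42/40 ratio cutoff (< 0.059).

    Both inputs must be in the same concentration units (e.g., pg/mL).
    """
    return (csf_abeta42 / csf_abeta40) < 0.059

# Hypothetical values for illustration only: ratio = 0.05, so positive.
print(is_amyloid_beta_positive(600.0, 12_000.0))  # True
```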
Dr. Palmqvist and colleagues also examined a validation cohort that included 237 participants who had been enrolled between January 29, 2000, and October 11, 2006, in Ulm and Hannover, Germany. This group included 34 CU participants, 109 participants with MCI, and 94 participants with mild Alzheimer’s disease dementia. The investigators applied the same cutoff of CSF amyloid-beta 42/amyloid-beta 40 to define amyloid-beta positivity in this cohort as they applied to the BioFINDER cohort.
Automated immunoassay had high predictive accuracy
The mean age of the BioFINDER cohort was 72 years, and 52.5% of participants were female. Overall, 44% of this cohort was amyloid-beta positive, including 29% of CU participants, 60% of participants with MCI, and 100% of participants with Alzheimer’s dementia. The investigators found statistically significant positive correlations between all plasma and corresponding CSF biomarkers in this cohort.
Plasma amyloid-beta 42 and amyloid-beta 40 levels predicted amyloid-beta status with an area under the receiver operating characteristic curve (AUC) of 0.80. When the researchers added APOE to the model, the AUC increased significantly to 0.85. Accuracy improved slightly when the researchers added plasma tau (AUC, 0.86) or tau and neurofilament light (AUC, 0.87) to amyloid-beta 42, amyloid-beta 40, and APOE. The results were similar in CU and cognitively impaired participants, and in younger and older participants.
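To make the modeling step concrete, here is a minimal sketch of how one might fit a logistic model on plasma measures plus APOE and score it by AUC, using scikit-learn on synthetic data. All values, effect sizes, and feature codings are invented; the study's actual assay values and model are not reproduced here.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n = 842  # size of the BioFINDER cohort described above

# Synthetic stand-ins for a plasma amyloid-beta 42/40 ratio and APOE e4
# allele count; the true assay values and coefficients are not public here.
ratio = rng.normal(0.06, 0.01, size=n)
apoe_e4 = rng.integers(0, 3, size=n)
X = np.column_stack([ratio, apoe_e4])

# Invented relationship: lower plasma ratio and more e4 alleles raise
# the odds of amyloid-beta positivity.
logit = -350 * (ratio - 0.06) + 0.9 * apoe_e4 - 0.6
y = rng.random(n) < 1 / (1 + np.exp(-logit))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_tr, y_tr)
auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
print(f"AUC on held-out data: {auc:.2f}")
```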
In the validation cohort, the mean age was 66 years, and 50.6% of participants were female. When Dr. Palmqvist and colleagues applied the plasma amyloid-beta 42 and amyloid-beta 40 model from the BioFINDER cohort to this population, they obtained a slightly higher AUC (0.86), but plasma tau did not increase predictive accuracy.
The investigators performed a cost-benefit analysis of a scenario in which 1,000 amyloid-positive participants are to be enrolled in a trial, assuming a cost of $4,000 per participant for amyloid PET. Prescreening with plasma amyloid-beta 42, amyloid-beta 40, and APOE in this scenario reduced PET costs by as much as 30%-50%, depending on the cutoff.
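The arithmetic behind that estimate can be sketched as follows; the $4,000 PET cost and the 1,000-participant target come from the article, while the positivity rates and the omission of the blood test's own cost are simplifying assumptions.

```python
# Back-of-envelope version of the PET cost scenario described above.
PET_COST = 4_000         # dollars per amyloid PET scan (from article)
TARGET_POSITIVE = 1_000  # amyloid-positive participants needed (from article)

def pet_cost_to_enroll(positive_rate_among_scanned: float) -> float:
    """Total PET spend to find TARGET_POSITIVE positives, given the
    fraction of scanned participants who turn out amyloid positive."""
    scans_needed = TARGET_POSITIVE / positive_rate_among_scanned
    return scans_needed * PET_COST

# Without prescreening: scan an unselected cohort at the 44% base rate
# of amyloid positivity reported above for BioFINDER.
baseline_cost = pet_cost_to_enroll(0.44)

# With a blood prescreen that enriches the scanned pool to an assumed
# 65%-75% positivity rate, depending on the chosen cutoff. The cost of
# the blood test itself is ignored here as a simplification.
for enriched_rate in (0.65, 0.75):
    cost = pet_cost_to_enroll(enriched_rate)
    saving = 1 - cost / baseline_cost
    print(f"enriched to {enriched_rate:.0%}: saves {saving:.0%} of PET costs")
```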
Validation cohort was small
Dr. Palmqvist and colleagues acknowledged that a lack of data about APOE was a limitation of their validation analysis. Other limitations that they acknowledged were the small population size, which precluded subpopulation analysis, and the lack of improvement in predictive ability when they replicated the model that included plasma tau.
“Overall, the accuracies of the amyloid-beta 42 and amyloid-beta 40 assays are not sufficient to be used on their own as a clinical test of amyloid-beta positivity,” said Dr. Palmqvist and colleagues. “Additional assay development is needed before this can be recommended, possibly together with other blood biomarkers and screening tools in diagnostic algorithms.”
Even though additional validation studies are necessary, the present findings indicate “the potential usefulness blood assays might have, especially considering the ongoing great need to recruit large cohorts for Alzheimer’s disease drug trials in preclinical and prodromal stages,” the authors concluded.
This investigation was funded by foundations including the European Research Council, the Swedish Research Council, and the Knut and Alice Wallenberg foundation. Several authors are employees of the Roche Group. One author served on a scientific advisory board for Roche Diagnostics, and another received institutional research support from that company.
SOURCE: Palmqvist S et al. JAMA Neurol. 2019 Jun 24. doi: 10.1001/jamaneurol.2019.1632.
The investigation by Palmqvist et al. “makes several significant advancements in the field,” said Sid E. O’Bryant, PhD, professor of pharmacology and neuroscience at the University of North Texas Health Science Center in Fort Worth, in an accompanying editorial. The study’s protocol design clears the ground for a context of use of a blood screen for amyloid positivity. Also, the fully automated immunoassay “yields performance measurements that are superior to [those of] many earlier nonautomated procedures,” said Dr. O’Bryant. When Dr. Palmqvist and colleagues applied their discovery findings from a training cohort directly to a test cohort, it produced strong results. “This study suggests that the field is one step closer to the actual application of blood-based biomarkers with specific contexts of use in Alzheimer’s disease.”
The main concern about the plasma biomarkers, however, is the scalability of the methods used to measure them. “If primary care physicians are to use such a technology, the technology must have the capacity to conduct hundreds of millions of assays annually around the globe,” said Dr. O’Bryant. “A blood test for primary care must fit into the existing protocols and parameters in clinical laboratory settings. The blood collection and processing procedures are not applicable to standard clinical lab practice and will cause substantial barriers to clinical application.”
Plasma levels of amyloid-beta 42 and amyloid-beta 40, measured with a fully automated immunoassay, can predict the presence of cerebral amyloid-beta, according to research published online ahead of print June 24 in JAMA Neurology. Analyzing APOE genotype in addition to these biomarkers increases the accuracy of the prediction. This blood test thus could allow neurologists to identify patients at risk of amyloid-beta positivity who should undergo further assessment, said the authors. It also could be used to enroll amyloid-beta–positive participants in clinical trials.
In vivo PET imaging and analysis of cerebrospinal fluid (CSF) can detect amyloid-beta, but these procedures are expensive, and their availability is limited. Clinicians need readily available methods for detecting amyloid-beta, and research has indicated that blood-based biomarkers correlate with those in CSF. Fully automated immunoassays, such as the Elecsys test developed by Roche Diagnostics, have recently demonstrated high reliability and precision for CSF amyloid-beta. Using the Elecsys assay, Sebastian Palmqvist, MD, PhD, a neurologist at Skåne University Hospital in Malmö, Sweden, and colleagues sought to examine the accuracy of plasma amyloid-beta and tau, together with other blood-based biomarkers, at detecting cerebral amyloid-beta.
Testing the immunoassay in two cohorts
Dr. Palmqvist and colleagues examined participants in the prospective Swedish BioFINDER Study, which enrolled patients between July 6, 2009, and February 11, 2015. This cohort included 513 cognitively unimpaired (CU) participants, 265 participants with mild cognitive impairment (MCI), and 64 participants with Alzheimer’s disease dementia. Investigators collected blood and CSF samples at the same time from all participants. Participants’ amyloid-beta status was ascertained using the Elecsys CSF amyloid-beta 42/amyloid-beta 40 ratio. The researchers defined amyloid-beta positivity with an unbiased cutoff of less than 0.059.
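For readers who want the decision rule stated precisely, it reduces to a single threshold comparison. The Python sketch below uses the study's 0.059 cutoff; the function name and the example concentrations are illustrative assumptions, not values from the paper:

    def amyloid_positive(csf_abeta42, csf_abeta40, cutoff=0.059):
        """Classify amyloid-beta status from the CSF 42/40 ratio."""
        return (csf_abeta42 / csf_abeta40) < cutoff

    # Illustrative concentrations in pg/mL (not from the study):
    print(amyloid_positive(600.0, 12000.0))   # ratio 0.050 -> True (positive)
    print(amyloid_positive(900.0, 12000.0))   # ratio 0.075 -> False (negative)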
Dr. Palmqvist and colleagues also examined a validation cohort that included 237 participants who had been enrolled between January 29, 2000, and October 11, 2006, in Ulm and Hannover, Germany. This group included 34 CU participants, 109 participants with MCI, and 94 participants with mild Alzheimer’s disease dementia. The investigators applied the same cutoff of CSF amyloid-beta 42/amyloid-beta 40 to define amyloid-beta positivity in this cohort as they applied to the BioFINDER cohort.
Automated immunoassay had high predictive accuracy
The mean age of the BioFINDER cohort was 72 years, and 52.5% of participants were female. Overall, 44% of this cohort was amyloid-beta positive, including 29% of CU participants, 60% of participants with MCI, and 100% of participants with Alzheimer’s dementia. The investigators found statistically significant positive correlations between all plasma and corresponding CSF biomarkers in this cohort.
Plasma amyloid-beta 42 and amyloid-beta 40 levels predicted amyloid-beta status with an area under the receiver operating characteristic curve (AUC) of 0.80. When the researchers added APOE to the model, the AUC increased significantly to 0.85. Accuracy improved slightly when the researchers added plasma tau (AUC, 0.86) or tau and neurofilament light (AUC, 0.87) to amyloid-beta 42, amyloid-beta 40, and APOE. The results were similar in CU and cognitively impaired participants, and in younger and older participants.
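As a rough illustration of how adding a predictor such as APOE can raise the AUC, the following self-contained Python sketch fits nested logistic regression models to synthetic data. The sample size, effect sizes, and modeling choices are assumptions for demonstration only, not a reproduction of the study's analysis:

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n = 2000
    abeta42 = rng.normal(20.0, 4.0, n)    # synthetic plasma levels, arbitrary units
    abeta40 = rng.normal(250.0, 40.0, n)
    apoe_e4 = rng.integers(0, 3, n)       # synthetic APOE epsilon-4 allele count

    # Synthetic "true" amyloid status driven by the 42/40 ratio and APOE.
    ratio = abeta42 / abeta40
    ratio_z = (ratio - ratio.mean()) / ratio.std()
    logit = -1.5 * ratio_z + 0.7 * apoe_e4 - 0.6
    y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

    X_all = np.column_stack([abeta42, abeta40, apoe_e4])
    for cols, label in [([0, 1], "Abeta42 + Abeta40"),
                        ([0, 1, 2], "Abeta42 + Abeta40 + APOE")]:
        X_tr, X_te, y_tr, y_te = train_test_split(X_all[:, cols], y, random_state=0)
        model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
        auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
        print(f"{label}: AUC = {auc:.2f}")   # the APOE model should score higher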
In the validation cohort, the mean age was 66 years, and 50.6% of participants were female. When Dr. Palmqvist and colleagues applied the plasma amyloid-beta 42 and amyloid-beta 40 model from the BioFINDER cohort to this population, they obtained a slightly higher AUC (0.86), but plasma tau did not increase predictive accuracy.
The investigators performed a cost-benefit analysis of a scenario in which 1,000 amyloid-positive participants are enrolled in a trial, assuming a cost of $4,000 per amyloid PET scan. Using plasma amyloid-beta 42, amyloid-beta 40, and APOE to prescreen participants in this scenario reduced PET costs by as much as 30%-50%, depending on the cutoff.
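The arithmetic behind such savings can be sketched as follows. Only the $4,000 PET cost and the 1,000-participant target come from the article; the prevalence, sensitivity, and specificity below are assumed values for illustration:

    # Illustrative only: prevalence and screen performance are assumptions,
    # not figures from the study.
    prevalence = 0.40     # assumed share of amyloid-positive screenees
    sensitivity = 0.85    # assumed blood-screen sensitivity at the cutoff
    specificity = 0.75    # assumed blood-screen specificity
    pet_cost = 4000       # $ per amyloid PET scan (from the article)
    target = 1000         # amyloid-positive participants needed

    # Without prescreening: PET everyone until 1,000 positives are found.
    scans_pet_only = target / prevalence

    # With prescreening: PET only those who screen positive on the blood test.
    screen_pos_rate = prevalence * sensitivity + (1 - prevalence) * (1 - specificity)
    ppv = prevalence * sensitivity / screen_pos_rate   # fraction of screen-positives that are true positives
    scans_prescreened = target / ppv

    savings = 1 - scans_prescreened / scans_pet_only
    print(f"PET-only cost: ${scans_pet_only * pet_cost:,.0f}")
    print(f"Prescreened PET cost: ${scans_prescreened * pet_cost:,.0f}")
    print(f"PET cost reduction: {savings:.0%}")   # ~42% under these assumptions

Under these assumed operating characteristics the reduction lands at roughly 42%, within the 30%-50% range the investigators reported.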
Validation cohort was small
Dr. Palmqvist and colleagues acknowledged that a lack of data about APOE was a limitation of their validation analysis. Other limitations that they acknowledged were the small population size, which precluded subpopulation analysis, and the lack of improvement in predictive ability when they replicated the model that included plasma tau.
“Overall, the accuracies of the amyloid-beta 42 and amyloid-beta 40 assays are not sufficient to be used on their own as a clinical test of amyloid-beta positivity,” said Dr. Palmqvist and colleagues. “Additional assay development is needed before this can be recommended, possibly together with other blood biomarkers and screening tools in diagnostic algorithms.”
Even though additional validation studies are necessary, the present findings indicate “the potential usefulness blood assays might have, especially considering the ongoing great need to recruit large cohorts for Alzheimer’s disease drug trials in preclinical and prodromal stages,” the authors concluded.
The investigation by Palmqvist et al. “makes several significant advancements in the field,” said Sid E. O’Bryant, PhD, professor of pharmacology and neuroscience at the University of North Texas Health Science Center in Fort Worth, in an accompanying editorial. The study’s protocol design establishes a specific context of use for a blood screen for amyloid positivity. Also, the fully automated immunoassay “yields performance measurements that are superior to [those of] many earlier nonautomated procedures,” said Dr. O’Bryant. When Dr. Palmqvist and colleagues applied their discovery findings from the training cohort directly to the test cohort, the results held up. “This study suggests that the field is one step closer to the actual application of blood-based biomarkers with specific contexts of use in Alzheimer’s disease.”
The main concern about the plasma biomarkers, however, is the scalability of the methods used to measure them. “If primary care physicians are to use such a technology, the technology must have the capacity to conduct hundreds of millions of assays annually around the globe,” said Dr. O’Bryant. “A blood test for primary care must fit into the existing protocols and parameters in clinical laboratory settings. The blood collection and processing procedures are not applicable to standard clinical lab practice and will cause substantial barriers to clinical application.”
In addition, the study authors emphasize the utility of the immunoassay for primary care, but the study was designed to test for amyloid positivity, which is more appropriate for clinical trials. “No currently available drugs for patient use target amyloid,” said Dr. O’Bryant. “Therefore, this specific context of use is geared more toward clinical trial application than primary care physicians who currently need a test for the presence or absence of Alzheimer’s disease so currently available treatments and support can be put in place for patients and family members.”
Nevertheless, Dr. Palmqvist and associates have presented promising data, Dr. O’Bryant continued. The question in the field is no longer whether blood biomarkers can be used in Alzheimer’s disease, but how.
This investigation was funded by organizations including the European Research Council, the Swedish Research Council, and the Knut and Alice Wallenberg Foundation. Several authors are employees of the Roche Group. One author served on a scientific advisory board for Roche Diagnostics, and another received institutional research support from that company.
SOURCE: Palmqvist S et al. JAMA Neurol. 2019 Jun 24. doi: 10.1001/jamaneurol.2019.1632.
FROM JAMA NEUROLOGY
FDA approves first treatment for neuromyelitis optica spectrum disorder
Soliris (eculizumab), a complement inhibitor, is the first FDA-approved treatment for NMOSD, a rare autoimmune disease of the central nervous system that mainly affects the optic nerves and spinal cord, according to a news release.
About 73% of patients with NMOSD test positive for anti-AQP4 antibodies, and complement activation resulting from anti-AQP4 antibodies is an underlying cause of the disease, according to the news release from Alexion, the company that markets the drug. The average age of NMOSD onset is 39 years, and the disease can lead to permanent visual impairment and paralysis. The condition, previously known as Devic’s disease, may affect between 4,000 and 8,000 people in the United States. NMOSD may be confused with other neurologic conditions such as multiple sclerosis.
Investigators studied the drug’s effectiveness in a placebo-controlled clinical trial of 143 patients with NMOSD who had anti-AQP4 antibodies. Compared with placebo, Soliris reduced the number of NMOSD relapses by 94% during the 48-week study. Nearly 98% of patients in the PREVENT trial who received Soliris were relapse-free after 48 weeks, compared with 63% of patients who received placebo.
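A rough consistency check on those figures, assuming the 94% reflects a relative reduction in relapse risk (the trial's primary analysis was a time-to-event comparison, so the crude calculation below is only an approximation):

    # Crude relative risk reduction implied by the relapse-free proportions.
    relapse_risk_soliris = 1 - 0.98    # ~2% relapsed on Soliris
    relapse_risk_placebo = 1 - 0.63    # ~37% relapsed on placebo
    rrr = 1 - relapse_risk_soliris / relapse_risk_placebo
    print(f"Implied relative risk reduction: {rrr:.0%}")  # ~95%, near the reported 94%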
Soliris also reduced hospitalizations and the need for corticosteroids and plasma exchange to treat acute attacks.
Soliris carries a boxed warning about life-threatening and fatal meningococcal infections that have occurred in treated patients. Patients should be monitored and evaluated immediately if infection is suspected, according to the FDA announcement. In addition, health care professionals should use caution when administering Soliris to patients with any other infection. No cases of meningococcal infection were observed in the PREVENT trial.
Soliris is available through a restricted program under a Risk Evaluation and Mitigation Strategy (REMS). Prescribers must counsel patients about the risk of meningococcal infection and ensure that patients have been vaccinated with meningococcal vaccines.
Adverse reactions in the NMOSD clinical trial included upper respiratory infection, nasopharyngitis, diarrhea, back pain, dizziness, influenza, joint pain, sore throat, and confusion.
The drug’s use for NMOSD received Orphan Drug designation, which provides incentives for the development of drugs for rare diseases.
Eculizumab first was approved by the FDA in 2007 and also may be used to treat paroxysmal nocturnal hemoglobinuria, atypical hemolytic uremic syndrome, and myasthenia gravis.
Dr. Eve Espey: Some good news in her 2019 contraceptive update
NASHVILLE, TENN. – There’s some good news on the contraception and reproductive health front, according to a recent update from Eve Espey, MD.
The unintended pregnancy rate in the United States, including among adolescents and young women, is declining, and the U.S. abortion rate is at its lowest level since Roe v. Wade, she said at the annual clinical and scientific meeting of the American College of Obstetricians and Gynecologists.
A 2016 article based on 2008-2011 data showed that after hovering around 50% for nearly 3 decades, the unintended pregnancy rate dropped “for the first time in a very long period of time,” said Dr. Espey, professor and chair of the department of obstetrics & gynecology, division of family planning at the University of New Mexico, Albuquerque (N Engl J Med. 2016; 374[9]:843-52).
“It doesn’t look that impressive – it basically went down to 45%, but considering the scope and the number of women who are affected by unplanned pregnancy, this is actually a huge public health achievement,” she said. “And I think ... in the next cycles of the [Centers for Disease Control and Prevention’s] National Survey of Family Growth ... we’ll hopefully continue to see this and potentially more [decline].”
As for abortion rates, an increase occurred following Roe v. Wade, but rates are now down to pre-Roe levels.
“One of the things that we know about the abortion rate is that the most important determinant ... is access to contraceptives,” Dr. Espey said, noting that both the abortion and unintended pregnancy rate declines are attributable to better and more consistent use of contraceptives, increased abstinence as teens are waiting longer to have sex, and the “meteoric rise in long-acting reversible contraceptive (LARC) use.”
“Importantly, while improvements in public health have traditionally only impacted upper-class white women, a reduction is finally occurring in disparities with women of color, but those disparities still remain,” she added. “Just like we’re focusing so much on this relative to maternal mortality, the same kinds of disparities occur in access to reproductive health.”
Dr. Espey also provided updates on other aspects of contraception.
IUDs and other LARC methods
LARC use among reproductive-aged women using contraception increased from 2% in 2002 to 12% in 2012. The majority of that change was in IUD use, with a small increase in implant use, she said, noting that the latest data from the 2015-2017 cycle of the National Survey of Family Growth show that the rate is now up to 16%.
“The rise has been nothing that I ever imagined that I would see, certainly in my professional career,” she said.
The huge impact of LARCs on the unintended pregnancy rate is attributable to consistent effectiveness over time, compared with an increasing failure rate over time with short-acting contraceptive methods, she said, explaining that while the failure rate with oral contraceptives is about 8%-9% over the first 3 years, it increases to 53% at 8 years.
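Those figures are consistent with compounding an annual typical-use failure rate, as the short calculation below illustrates; the constant ~9% annual rate is an assumption for illustration:

    # Cumulative typical-use failure, assuming a constant annual rate.
    annual_failure = 0.09
    for years in (1, 3, 8):
        cumulative = 1 - (1 - annual_failure) ** years
        print(f"{years} yr: {cumulative:.0%}")
    # Prints 9%, 25%, 53% -- the 8-year value matches the 53% figure cited above.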
It’s a matter of looking at both “typical use” effectiveness and continuation rates: LARCs have 1-year continuation rates of about 75%-85%, whereas Depo-Provera, for example, has a 25%-30% continuation rate at 1 year, she noted.
Dr. Espey also attributed the gains to improved access via the Affordable Care Act’s contraceptive mandate, which has been shown in numerous studies to have improved access and consistency of contraceptive use, but which is “currently being chipped away,” and to the federal Title X program that covers family planning care for low-income women, including undocumented women.
“These two programs have made a huge impact for us, and I hope that we as ob.gyns. will continue to support them,” she said.
Reproductive justice
Despite their effectiveness, it is important to remember that LARC methods are not right for everyone, Dr. Espey said.
“It’s not all about effectiveness. Women have many reasons for accessing contraception, and our job is not to reduce unintended pregnancy. ... The idea really is that we empower women. ... We should really give choices and trust women to make the best choices for them,” she explained.
Barriers to IUD removal also should be eliminated, she noted, explaining that a woman who wants her IUD removed a month after insertion should have that option.
She said she has “changed her language,” from asking why a woman wants an $800 IUD removed after a month to asking whether she would like to hear about ways to make it better or if she is “just ready to have it removed.”
For those not interested in a discussion about birth control, she suggested providing information about the bedsider.org site.
“This is a great resource for patients,” she said, noting that it is available in both English and Spanish.
U.S. Medical Eligibility Criteria and Selected Practice Recommendations on contraceptive use
The MEC contraceptive guidance, a regularly updated, evidence-based project of the CDC, provides “best practices” information on candidate selection, or the “who” of contraceptive selection (who is a candidate for a particular method), Dr. Espey said, noting that it’s a “handy resource” for in-office use.
The SPR is more of a “how-to” guide that provides specifics on contraceptive use, such as when a woman can rely on the pill for contraception after she starts taking it, or how a woman should be followed after IUD placement, she said.
A free CDC app provides access to both.
Emergency contraception
The best overall emergency contraceptive method is the copper IUD, but often it is less accessible than oral methods, of which ulipristal acetate (ella) is the best choice, Dr. Espey said.
“Ulipristal is kind of a best-kept secret. It’s a selective progesterone-receptor modulator – it actually works better and longer than Plan B (levonorgestrel). What’s great about Plan B is that you can get it over the counter, but ulipristal delays ovulation longer,” she explained.
Contraceptives and obesity
Oral contraceptive efficacy is “so much more about adherence” than about weight, she said.
With respect to the contraceptive patch, limited evidence suggests that obesity may reduce effectiveness, but “it’s still way better than barrier methods,” and for the contraceptive ring, no evidence suggests that obesity affects efficacy, she said.
For emergency contraception, evidence suggests that ulipristal is more effective than Plan B in women with high body mass index.
OTC contraceptive access
Pharmacy and OTC access are a good idea, Dr. Espey said.
“ACOG now supports both, which is great, and there are now a number of states where women can access contraception through the pharmacy. There are a lot of barriers there as well, and really the answer is OTC access,” she said. “There is a pill right now that is seeking [Food and Drug Administration] approval; it will be a progestin-only pill – the first one to be available over the counter, so I think this is something that we’ll see in the next 5-10 years.”
Additional future directions
One technology in development is a longer-acting injectable, such as a 6- or 9-month Depo-type shot.
Biodegradable implants also are in development. “What a cool idea – it just disappears in your arm, no need to remove it,” Dr. Espey said, adding that nonsurgical permanent sterilization is another possible advance, which would be “a holy grail.”
As for male contraception?
“I’ve been saying for about 25 years that in 5 years we’ll have a male contraceptive, so I’m not going to say it anymore with any kind of time frame, but it’s possible,” she said.
Dr. Espey reported having no financial disclosures.
EXPERT ANALYSIS FROM ACOG 2019
FDA expands Doptelet approval to ITP patients with thrombocytopenia
The Food and Drug Administration has approved a supplemental New Drug Application expanding the indication of avatrombopag (Doptelet) to include treatment of thrombocytopenia in adults with chronic immune thrombocytopenia (ITP) with insufficient response to previous therapy, according to Dova Pharmaceuticals.
FDA approval was based on results of a phase 3 trial in which a majority of patients who received avatrombopag achieved a platelet count of at least 50,000 per mcL after 8 days of therapy. In addition, avatrombopag was superior to placebo in maintaining platelet counts during the 6-month treatment period.
Avatrombopag – an oral, thrombopoietin receptor agonist administered with food – was previously indicated for the treatment of thrombocytopenia in adults with chronic liver disease who are scheduled to undergo a procedure. The most common adverse reactions in patients with ITP include headache, fatigue, contusion, epistaxis, upper respiratory tract infection, arthralgia, gingival bleeding, petechiae, and nasopharyngitis.
Find the full press release on the Dova Pharmaceuticals website.
JAK inhibitors are the ‘near future’ of alopecia areata treatment
MILAN – Janus kinase (JAK) inhibitors have clear science supporting their use in alopecia areata, and an increasing number of positive studies demonstrate their efficacy in regrowing hair, Brett King, MD, PhD, said at the World Congress of Dermatology.
Although JAK inhibitors are not yet specifically approved for alopecia areata, they may be used off label to treat the condition, said Dr. King, associate professor of dermatology at Yale University, New Haven, Conn.
“JAK inhibitors are very much within our reach for the treatment of severe alopecia areata,” Dr. King said in an oral presentation on therapeutic advances for alopecia. “We need to follow the science,” he added. “We would not be here telling a story about JAK inhibitors and these other agents without very bright scientists, so we really have to applaud the people who made this the focus of their research.”
JAK science
The science supporting JAK inhibitors can be traced back to a 2014 report by Angela M. Christiano, PhD, Raphael Clynes, and colleagues at Columbia University, New York, showing that alopecia areata is driven by cytotoxic T lymphocytes, and is reversed by inhibition of the JAK/STAT pathway in mouse models of disease (Nat Med. 2014 Sep;20[9]:1043-9). Those investigators also reported near-complete regrowth of hair in three patients who received oral ruxolitinib, an inhibitor of JAK1 and JAK2, hinting at the potential clinical importance of this targeted approach.
As Dr. King explained, secretion of interleukin (IL)-15 from the hair follicle epithelial cell activates CD8+NKG2D+ T cells, leading to secretion of interferon (IFN)-gamma, which has a receptor on the hair follicle epithelial cell, activating that cell to secrete more IL-15.
“IL-15 and IFN-gamma both signal through the JAK/STAT pathway,” he said. “There are over 50 cytokines that signal through the JAK/STAT pathway, including IFN-gamma and IL-15, and on binding their receptor at the cell surface, they pass the baton, if you will, to the JAK enzymes, of which there are 4 members – JAK1, 2, 3 and tyrosine kinase 2. These enzymes subsequently pass the baton to STAT, and STAT translocates to the nucleus, where transcription occurs and disease happens. So we have an opportunity then with a small molecule JAK inhibitor to mediate disease, such that if we give this person a JAK inhibitor, they should regrow hair.”
JAK data
A number of studies of JAK inhibitors support that science, including an open-label study of 66 patients treated with the JAK1/3 inhibitor tofacitinib twice daily (JCI Insight. 2016 Sep 22;1[15]:e89776). About one-third experienced a 50% or greater improvement from baseline, as measured by the Severity of Alopecia Tool (SALT) score, over 3 months of treatment, with adverse events limited to grade 1-2 infections, according to the authors, who included Dr. King.
Around the same time, results of an open-label study with ruxolitinib, a JAK1/2 inhibitor, were published showing that 9 of 12 patients had complete or near complete scalp hair regrowth over 6 months of treatment, he said.
In a subsequent retrospective study of 90 patients treated with tofacitinib, about 66%-70% of patients experienced regrowth of hair, depending on the dose received. However, that study also showed that hair regrowth was unlikely in patients with complete or near-complete scalp hair loss for 10 years or more, Dr. King said. An additional study showed that tofacitinib may be as effective in adolescents as in adults, or even more effective, he added, while another found that low-dose ruxolitinib was as effective as higher-dose ruxolitinib for the treatment of severe alopecia areata.
Earlier in 2019, news surrounded the results of two randomized, double-blind, placebo-controlled trials, reported at the annual meeting of the American Academy of Dermatology in Washington, showing efficacy for investigational oral JAK-targeted agents: the JAK1/2 inhibitor CTP-543 in one trial, and the TYK2/JAK1 inhibitor PF-06700841 and the JAK3 inhibitor PF-06651600 in the other.
“I think this really is the near future of alopecia areata treatment,” Dr. King said.
No success yet for topical JAKs
One area where JAK inhibitors have not yet shined is topical formulations. In a pilot study of tofacitinib 2% ointment, only 1 of 10 patients had significant scalp hair growth, and a study of topical ruxolitinib was stopped early, with results not yet reported, according to Dr. King. “As dermatologists, we’re always interested in topical therapy for skin disease, but I’m not sure that alopecia areata is a disease for which topical JAK inhibitors will be effective,” he said.
Dr. King reported disclosures related to Aclaris Therapeutics, Concert Pharmaceuticals, Dermavant Sciences, Eli Lilly, Pfizer, Regeneron, and Sanofi Genzyme.
MILAN – Janus kinase (JAK) inhibitors have clear science supporting their use in alopecia areata, and an increasing number of positive studies demonstrate their efficacy in regrowing hair, Brett King, MD, PhD, said at the World Congress of Dermatology.
Although not yet specifically approved for , said Dr. King, associate professor of dermatology at Yale University, New Haven, Conn.
“JAK inhibitors are very much within our reach for the treatment of severe alopecia areata,” Dr. King said in an oral presentation on therapeutic advances for alopecia. “We need to follow the science,” he added. “We would not be here telling a story about JAK inhibitors and these other agents without very bright scientists, so we really have to applaud the people who made this the focus of their research.”
JAK science
The science supporting JAK inhibitors can be traced back to a 2014 report by Angela M. Christiano, PhD,, Raphael Clynes, of Columbia University, New York, and others showing that alopecia areata is driven by cytotoxic T lymphocytes, and is reversed by inhibition of the JAK/STAT pathway in mouse models of disease (Nat Med. 2014 Sep;20[9]:1043-9). Those investigators also reported near-complete regrowth of hair in three patients who received oral ruxolitinib, an inhibitor of JAK1 and JAK2, hinting at the potential clinical importance of this targeted approach.
As Dr. King explained, secretion of interleukin (IL)-15 from the hair follicle endothelial cell activates CD8+NKG2D+ T cells leading to secretion of interferon (IFN)-gamma, which has a receptor on the hair follicle epithelial cell, activating that cell to secrete more IL-15.
“IL-15 and IFN-gamma both signal through the JAK/STAT pathway,” he said. “There are over 50 cytokines that signal through the JAK/STAT pathway, including IFN-gamma and IL-15, and on binding their receptor at the cell surface, they pass the baton, if you will, to the JAK enzymes, of which there are 4 members – JAK1, 2, 3 and tyrosine kinase 2. These enzymes subsequently pass the baton to STAT, and STAT translocates to the nucleus, where transcription occurs and disease happens. So we have an opportunity then with a small molecule JAK inhibitor to mediate disease, such that if we give this person a JAK inhibitor, they should regrow hair.”
JAK data
A number of studies of JAK inhibitors support that science, including an open-label study of 66 patients treated with the JAK1/3 inhibitor tofacitinib twice daily (JCI Insight. 2016 Sep 22;1[15]:e89776). About one-third experienced a 50% or greater improvement from baseline, as measured by the severity of alopecia tool (SALT) score over 3 months of treatment, with adverse events limited to grade 1-2 infections, according to the authors, which included Dr. King.
Around the same time, results of an open-label study with ruxolitinib, a JAK1/2 inhibitor, were published showing that 9 of 12 patients had complete or near complete scalp hair regrowth over 6 months of treatment, he said.
In a subsequent retrospective study of 90 patients treated with tofacitinib, about 66%-70% of patients experienced regrowth of hair, depending on the dose received. However, that study also showed that hair regrowth was unlikely in patients with complete or near complete scalp hair loss for 10 years or more, Dr. King said. An additional study showed that tofacitinib may be effective in adolescents as in adults, or even more effective, he added, while another found that low-dose ruxolitinib was as effective as higher dose ruxolitinib for the treatment of severe alopecia areata.
News earlier in 2019 surrounded the results of two randomized, double-blind placebo controlled trials, reported at the annual American Academy of Dermatology meeting in Washington, DC, showing efficacy for investigational oral JAK-targeted agents, a JAK 1/2 inhibitor (CTP-543), and a TYK2/JAK1 inhibitor (PF-06700841) and a JAK3 inhibitor (PF-06651600).
“I think this really is the near future of alopecia areata treatment,” Dr. King said.
No success yet for topical JAKs
One area where JAK inhibitors have not shined yet is in topical formulations. In a pilot study of tofacitinib 2% ointment, only 1 of 10 patients had significant scalp hair growth, while a study of topical ruxolitinib was stopped early and results have not yet been reported, according to Dr. King. “As dermatologists, we’re always interested in topical therapy for skin disease, but I’m not sure that alopecia areata is a disease for which topical JAK inhibitors will be effective,” he said.
Dr. King reported disclosures related to Aclaris Therapeutics, Concert Pharmaceuticals, Dermavant Sciences, Eli Lilly, Pfizer, Regeneron, and Sanofi Genzyme.
MILAN – Janus kinase (JAK) inhibitors have clear science supporting their use in alopecia areata, and an increasing number of positive studies demonstrate their efficacy in regrowing hair, Brett King, MD, PhD, said at the World Congress of Dermatology.
Although not yet specifically approved for , said Dr. King, associate professor of dermatology at Yale University, New Haven, Conn.
“JAK inhibitors are very much within our reach for the treatment of severe alopecia areata,” Dr. King said in an oral presentation on therapeutic advances for alopecia. “We need to follow the science,” he added. “We would not be here telling a story about JAK inhibitors and these other agents without very bright scientists, so we really have to applaud the people who made this the focus of their research.”
JAK science
The science supporting JAK inhibitors can be traced back to a 2014 report by Angela M. Christiano, PhD,, Raphael Clynes, of Columbia University, New York, and others showing that alopecia areata is driven by cytotoxic T lymphocytes, and is reversed by inhibition of the JAK/STAT pathway in mouse models of disease (Nat Med. 2014 Sep;20[9]:1043-9). Those investigators also reported near-complete regrowth of hair in three patients who received oral ruxolitinib, an inhibitor of JAK1 and JAK2, hinting at the potential clinical importance of this targeted approach.
As Dr. King explained, secretion of interleukin (IL)-15 from the hair follicle epithelial cell activates CD8+NKG2D+ T cells, leading to secretion of interferon (IFN)-gamma; the IFN-gamma receptor on the hair follicle epithelial cell then activates that cell to secrete more IL-15, perpetuating the loop.
“IL-15 and IFN-gamma both signal through the JAK/STAT pathway,” he said. “There are over 50 cytokines that signal through the JAK/STAT pathway, including IFN-gamma and IL-15, and on binding their receptor at the cell surface, they pass the baton, if you will, to the JAK enzymes, of which there are 4 members – JAK1, 2, 3 and tyrosine kinase 2. These enzymes subsequently pass the baton to STAT, and STAT translocates to the nucleus, where transcription occurs and disease happens. So we have an opportunity then with a small molecule JAK inhibitor to mediate disease, such that if we give this person a JAK inhibitor, they should regrow hair.”
JAK data
A number of studies of JAK inhibitors support that science, including an open-label study of 66 patients treated with the JAK1/3 inhibitor tofacitinib twice daily (JCI Insight. 2016 Sep 22;1[15]:e89776). About one-third experienced a 50% or greater improvement from baseline, as measured by the Severity of Alopecia Tool (SALT) score, over 3 months of treatment, with adverse events limited to grade 1-2 infections, according to the authors, who included Dr. King.
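To put that threshold in concrete terms (the figures here are illustrative, not taken from the study): the SALT score expresses the percentage of the scalp affected by hair loss, so a patient scoring 80 at baseline who improves to 40 during treatment has a change of (80 − 40)/80 = 50%, just meeting the 50%-improvement threshold.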
Around the same time, results of an open-label study with ruxolitinib, a JAK1/2 inhibitor, were published showing that 9 of 12 patients had complete or near complete scalp hair regrowth over 6 months of treatment, he said.
In a subsequent retrospective study of 90 patients treated with tofacitinib, about 66%-70% of patients experienced regrowth of hair, depending on the dose received. However, that study also showed that hair regrowth was unlikely in patients with complete or near-complete scalp hair loss for 10 years or more, Dr. King said. An additional study showed that tofacitinib may be as effective in adolescents as in adults, or even more so, he added, while another found that low-dose ruxolitinib was as effective as higher-dose ruxolitinib for the treatment of severe alopecia areata.
News earlier in 2019 surrounded the results of two randomized, double-blind, placebo-controlled trials, reported at the annual meeting of the American Academy of Dermatology in Washington, DC, showing efficacy for investigational oral JAK-targeted agents: one trial of a JAK1/2 inhibitor (CTP-543), and one evaluating both a TYK2/JAK1 inhibitor (PF-06700841) and a JAK3 inhibitor (PF-06651600).
“I think this really is the near future of alopecia areata treatment,” Dr. King said.
No success yet for topical JAKs
One area where JAK inhibitors have not yet shone is topical formulations. In a pilot study of tofacitinib 2% ointment, only 1 of 10 patients had significant scalp hair growth, and a study of topical ruxolitinib was stopped early, with results not yet reported, according to Dr. King. “As dermatologists, we’re always interested in topical therapy for skin disease, but I’m not sure that alopecia areata is a disease for which topical JAK inhibitors will be effective,” he said.
Dr. King reported disclosures related to Aclaris Therapeutics, Concert Pharmaceuticals, Dermavant Sciences, Eli Lilly, Pfizer, Regeneron, and Sanofi Genzyme.
EXPERT ANALYSIS FROM WCD2019
LARC prolongs interpregnancy intervals but doesn’t cut preterm birth risk
NASHVILLE, TENN. – Long-acting reversible contraception (LARC) prolongs interpregnancy intervals but does not reduce the risk of preterm birth when used between a first and second pregnancy, results of a retrospective cohort study suggest.
Of 35,754 women who had a first and second live birth between 2005 and 2015 and who received non-emergent care within 10 years of the first birth, 3,083 (9%) had evidence of interpregnancy LARC exposure. These women were significantly less likely to have short interpregnancy intervals than were the 32,671 women with either non-LARC contraceptive use or no record of contraceptive-related care (P less than .0001), Sara E. Simonsen, PhD, reported in a poster at the annual meeting of the American College of Obstetricians and Gynecologists.
Interpregnancy LARC use was documented in 4% of women with intervals of 12 months or less, 8% of those with intervals of 13-18 months, 11% of those with intervals of 19-24 months, and 13% of those with intervals greater than 24 months.
However, preterm birth, which occurred in 7% of first births and 6% of second births, was not lower among those with LARC exposure vs. those with no contraceptive encounters after adjustment for interpregnancy interval and a number of demographic and clinical factors, including education, presence of the father, mother’s age, Hispanic ethnicity, fetal anomalies, and preterm birth history (adjusted odds ratio, 1.13), said Dr. Simonsen, a certified nurse midwife at the University of Utah Hospital, Salt Lake City.
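To unpack that estimate (this restates the reported figure rather than adding a new analysis): an adjusted odds ratio of 1.13 corresponds to point-estimate odds of preterm birth that were (1.13 − 1) × 100% = 13% higher, not lower, with LARC exposure, consistent with the investigators’ finding of no protective association.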
“Preterm birth, a live birth at less than 37 weeks’ gestation, is a major determinant of poor neonatal outcomes,” she and her colleagues wrote. “Short interpregnancy interval, defined as less than 18 months, is an important risk factor for preterm birth.”
Given the increasing number of U.S. women who use highly effective LARCs to space pregnancies, she and her colleagues performed a retrospective cohort study of electronic medical records from two large health systems and linked them with birth and fetal death records to explore the relationship between interpregnancy LARC and both interpregnancy interval and preterm birth in the subsequent pregnancy.
“We did find that women who used LARC between their pregnancies were less likely to have a short interpregnancy interval, but in adjusted models ... we found no association between interpregnancy LARC use and preterm birth in the second birth,” Dr. Simonsen said during an e-poster presentation at the meeting.
In fact, preterm birth in the second birth was most strongly associated with a prior preterm birth – a finding consistent with the literature, she and her colleagues noted.
Although the findings are limited by the use of retrospective data not designed for research, the data came from a large population-based sample representing about 85% of Utah births, they said.
The findings suggest that while LARC use may not reduce preterm birth risk, it “may contribute favorably to outcomes to the extent that having optimal interpregnancy interval does,” they wrote.
“We feel that these findings support providers counseling women on the full range of contraception options in the postpartum and not pushing [intrauterine devices],” Dr. Simonsen added.
The related topic of immediate postpartum LARC use was addressed by Eve Espey, MD, in a separate presentation at the meeting.
Dr. Espey, professor and chair of the department of obstetrics and gynecology and director of the family planning fellowship at the University of New Mexico, Albuquerque, reported that immediate postpartum insertion of an intrauterine device (IUD) is highly cost-effective despite an expulsion rate of 10%-30%. She also addressed the value of postpartum LARC for reducing rapid-repeat pregnancy rates.
Payment models for immediate postpartum LARC are “very cumbersome,” but at her university, a persistent effort over 4 years has led to success. Immediate postpartum LARC is offered to women with Medicaid coverage, and payment is received in about 97% of cases, she said, adding that efforts are underway to help other hospitals “troubleshoot the issues.”
The lack of private insurance coverage for immediate postpartum LARC remains a challenge, but Dr. Espey said she remains “super enthusiastic” about its use.
“I think it’s going to take another 5 years or so [for better coverage], and honestly I think what we really need is an inpatient LARC CPT code to make this happen,” she said, urging colleagues to advocate for that within their American College of Obstetricians and Gynecologists sections when possible.
Dr. Simonsen and Dr. Espey reported having no relevant disclosures.
REPORTING FROM ACOG 2019
Immune modulators help anti-TNF agents battle Crohn’s disease, but not UC
Adding an immune modulator (IM) to anti–tumor necrosis factor (anti-TNF) initiation therapy benefits patients with Crohn’s disease (CD) but not those with ulcerative colitis (UC), according to a recent retrospective look at more than 1,000 cases.
The study showed that patients with CD who started combination therapy instead of monotherapy had lower rates of treatment ineffectiveness, experienced longer delays until hospitalization, and less often needed to switch their anti-TNF agent, reported lead author Laura E. Targownik, MD, of the University of Manitoba, in Winnipeg, Canada, and colleagues.
“Current guidelines on the medical management of IBD strongly support the use of IMs and anti-TNFs in combination over anti-TNF monotherapy,” the investigators wrote in Clinical Gastroenterology and Hepatology. “However, there is a sparsity of real-world data demonstrating the incremental benefits of combination therapy.”
The investigators noted that the SONIC trial, published in 2010, showed that patients treated with combination therapy were more likely to achieve corticosteroid-free remission at weeks 26 and 50; this became the basis of evidence leading multiple clinical guidelines to recommend combination therapy for patients with CD.
The present study involved 852 patients with CD and 303 with UC who began treatment with an anti-TNF agent during 2001-2016. Data were drawn from the Manitoba Inflammatory Bowel Disease (IBD) Epidemiology database.
The main outcome of interest was treatment ineffectiveness, which was defined by any of the following four events: acute, IBD-related hospital admission for more than 48 hours; resective intestinal surgery; corticosteroid use at least 14 days after initiating anti-TNF therapy, or, if corticosteroids were used within 16 weeks of anti-TNF initiation, then subsequent corticosteroid use occurring at least 16 weeks after initiation; or switching to a different anti-TNF agent. The investigators also looked for differences in effectiveness between two agents from each class: anti-TNF agents infliximab and adalimumab, and immunomodulators methotrexate and azathioprine.
Results showed that patients with CD had higher rates of ineffectiveness-free survival when treated with combination therapy instead of monotherapy at 1 year (74.2% vs. 68.6%) and 2 years (64.0% vs. 54.5%). In a Cox proportional hazards model, this translated to a 38% reduced risk of treatment ineffectiveness (adjusted hazard ratio, 0.62).
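For readers who want the arithmetic behind that figure (a restatement of the reported numbers, not an additional analysis): the relative risk reduction implied by a hazard ratio is 1 minus the ratio, so an adjusted hazard ratio of 0.62 corresponds to 1 − 0.62 = 0.38, or a 38% lower hazard of treatment ineffectiveness with combination therapy.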
“This suggests that the findings of the SONIC trial may extend to real-world clinical practice, even in patients who had previous IM exposure,” the investigators noted.
Combination therapy was also significantly associated with longer time to first IBD-related hospitalization (HR, 0.53) and longer time to switching anti-TNF agents (HR, 0.63). However, no such relationships were found for time to resective surgery or corticosteroid use. Although combination therapy had no impact on the rate of primary treatment ineffectiveness in multivariable logistic regression, patients who received anti-TNF therapy for more than 90 days had delayed secondary treatment ineffectiveness and fewer IBD-related hospitalizations. Choice of agent from either class had no influence on the effectiveness of combination therapy.
In contrast with the above findings, combination therapy in patients with UC was less promising, which aligns with previous studies.
“[W]e were not able to demonstrate a significant advantage to combination therapy in persons with UC,” the investigators wrote. “In addition, all published cohort studies to date have not been able to confirm a significant benefit to combination therapy in UC. ... In light of the lower quality of prior evidence, combined with the results from our study, the indication for combination therapy in UC would appear to be weaker.”
“Further analyses in larger cohorts may clarify whether there is a clinically relevant benefit of combination therapy in persons with UC,” the investigators concluded. “Because of the discrepancy between our findings and those of a meta-analysis of cohort studies previously published on this topic, confirmation of our results is required in future studies.”
The investigators disclosed no funding or conflicts of interest.
SOURCE: Targownik LE et al. Clin Gastroenterol Hepatol. 2018 Nov 15. doi: 10.1016/j.cgh.2018.11.003.
Twenty years after the approval of the first anti–tumor necrosis factor (TNF) biologic agent for the treatment of inflammatory bowel disease (IBD), patients and providers are still learning how to optimize these medications. One optimization is the use of combination therapy (immunomodulator and anti-TNF). Immunomodulators are used independently for maintenance of remission of IBD, and they have been shown to reduce immunogenicity and improve efficacy when used in combination with an anti-TNF agent in prior short-term randomized controlled trials. However, combination therapy is not universally practiced in the real world, and data are lacking on the risks and benefits of long-term use of these agents. Therefore, this article by Targownik et al. is very timely.
Importantly, this cohort included a mixed group of patients: those who had previously been on azathioprine monotherapy and those newly starting it at the time of anti-TNF initiation (similar to the mix seen in real-world practice). Data on risk factors for disease complications, such as disease phenotype or severity, were not available. In the smaller group of patients with ulcerative colitis, by contrast, none of the efficacy associations were improved with combination therapy.
As providers counsel patients on the benefits and risks of various IBD treatment choices, these data by Targownik et al. will inform decisions. Future research should incorporate additional means of biologic optimization, such as the use of therapeutic drug monitoring and/or risk factor–based selection of therapeutic agents, to better inform individualized treatment choices.
Millie D. Long, MD, MPH, is an associate professor of medicine in the division of gastroenterology and hepatology, Inflammatory Bowel Diseases Center; vice chief for education; and director of the Gastroenterology and Hepatology Fellowship Program at the University of North Carolina at Chapel Hill. She has the following conflicts of interest: AbbVie, Takeda, Pfizer, UCB, Janssen, Salix, Prometheus, Target Pharmasolutions, and Valeant.
FROM CLINICAL GASTROENTEROLOGY AND HEPATOLOGY