CF patients live longer in Canada than in U.S.

Different health care systems may be partly to blame

People with cystic fibrosis (CF) survive a median of 10 years longer if they live in Canada than if they live in the United States, according to a report published online March 14 in Annals of Internal Medicine.

Differences between the two nations’ health care systems, including access to insurance, “may, in part, explain the Canadian survival advantage,” said Anne L. Stephenson, MD, PhD, of St. Michael’s Hospital, Toronto, and her associates.

Previous studies have suggested a significant survival gap between Americans and Canadians with CF, but their conclusions were “problematic” because of inherent differences between the two countries in registry data, which complicated direct comparisons. Dr. Stephenson and her associates used several statistical strategies to adjust for these differences, and confirmed the discrepancy in survival by analyzing information for 45,448 U.S. patients and 5,941 Canadian patients treated at 110 U.S. and 42 Canadian specialty centers from 1990 through 2013.

Overall, there were 9,654 U.S. deaths and 1,288 Canadian deaths during the study period, for nearly identical overall mortality between the two countries (21.2% and 21.7%, respectively). However, median survival was 10 years longer in Canada (50.9 years) than in the United States (40.6 years), a gap that persisted across numerous analyses that adjusted for patient characteristics and clinical factors, including CF severity.

One particular difference between the two study populations was found to be key: Canada has single-payer universal health insurance, while the United States does not. When U.S. patients were categorized according to their insurance status, Canadians had a 44% lower risk of death than did U.S. patients receiving continuous Medicaid or Medicare (hazard ratio, 0.56; 95% confidence interval, 0.45-0.71; P less than .001), a 36% lower risk than did U.S. patients receiving intermittent Medicaid or Medicare (HR, 0.64; 95% CI, 0.51-0.80; P = .002), and a 77% lower risk of death than did U.S. patients with no or unknown health insurance (HR, 0.23; 95% CI, 0.14-0.37; P less than .001), the investigators said (Ann Intern Med. 2017 Mar 14. doi: 10.7326/M16-0858). In contrast, there was no survival advantage for Canadian patients when compared with U.S. patients who had private health insurance. This “[raises] the question of whether a disparity exists in access to therapeutic approaches or health care delivery,” the researchers noted.

This study was supported by the U.S. Cystic Fibrosis Foundation, Cystic Fibrosis Canada, the National Institutes of Health, and the U.S. Food and Drug Administration. Dr. Stephenson reported grants from the Cystic Fibrosis Foundation and fees from Cystic Fibrosis Canada. Several of the study’s other authors reported receiving fees from various sources and one of those authors reported serving on the boards of pharmaceutical companies.


Stephenson et al. confirmed that there is a “marked [survival] advantage” for CF patients in Canada, compared with those in the United States.

A key finding of this study was that the survival difference between the two countries disappeared when U.S. patients insured by Medicaid or Medicare and those with no health insurance were excluded from the analysis. The fundamental differences between the two nations’ health care systems seem to be driving this disparity in survival.

Median predicted survival for Canadians with CF is higher than that for U.S. patients, and this difference has increased over the last 2 decades.

Patrick A. Flume, MD, is at the Medical University of South Carolina in Charleston. Donald R. VanDevanter, PhD, is at Case Western Reserve University in Cleveland. They both reported ties to the Cystic Fibrosis Foundation. Dr. Flume and Dr. VanDevanter made these remarks in an editorial accompanying Dr. Stephenson’s report (Ann Intern Med. 2017 Mar 14. doi: 10.7326/M17-0564).

Vitals

Key clinical point: People with cystic fibrosis survive a median of 10 years longer if they live in Canada than if they live in the United States.

Major finding: Canadians with CF had a 44% lower risk of death than U.S. patients receiving Medicaid or Medicare and a striking 77% lower risk of death than U.S. patients with no health insurance, but the same risk as U.S. patients with private insurance.

Data source: A population-based cohort study involving 45,448 patients in a U.S. registry and 5,941 in a Canadian registry in 1990-2013.

Disclosures: This study was supported by the U.S. Cystic Fibrosis Foundation, Cystic Fibrosis Canada, the National Institutes of Health, and the Food and Drug Administration. The authors’ financial disclosures are available at www.acponline.org.

No rise in CV events seen with tocilizumab

Patients with refractory rheumatoid arthritis who switched to tocilizumab showed no increased cardiovascular risk when compared with those who switched to a tumor necrosis factor inhibitor in a large cohort study.

To examine this issue, Dr. Seoyoung C. Kim and her associates analyzed information on cardiovascular (CV) outcomes in three large health care claims databases covering all 50 states. They focused on 28,028 adults with RA who switched from taking at least one biologic agent or targeted synthetic disease-modifying antirheumatic drug to either tocilizumab (9,218 patients) or a tumor necrosis factor (TNF) inhibitor (18,810 patients). These patients were followed for a mean of 1 year.

To minimize the effect of confounding by the severity of RA and the baseline CV risk, the researchers adjusted the data to account for more than 90 variables related to CV events and to RA severity.

The primary outcome – a composite of myocardial infarction (MI) and stroke – occurred in 125 patients: 36 taking tocilizumab and 89 taking TNF inhibitors. The rate of this composite outcome was 0.52 per 100 person-years with tocilizumab and 0.59 per 100 person-years with TNF inhibitors, a nonsignificant difference, Dr. Kim and her associates reported (Arthritis Rheumatol. 2017 Feb 28. doi: 10.1002/art.40084).

There also were no significant differences between the two study groups in secondary endpoints, including rates of coronary revascularization, acute coronary syndrome, heart failure, and all-cause mortality. In addition, all subgroup analyses confirmed that tocilizumab did not raise CV risk, regardless of patient age (younger than or older than 60 years), the presence of cardiovascular disease at baseline, the presence of diabetes, the use of methotrexate, the use of oral steroids, or the use of statins.

These “reassuring” findings show that even though tocilizumab appears to raise LDL levels, “such increases do not appear to be associated with an increased risk of clinical CV events,” the investigators said.

The results confirm those reported at the 2016 American College of Rheumatology annual meeting for the 5-year, randomized, postmarketing ENTRACTE trial in which the lipid changes induced by tocilizumab did not translate into an increased risk of heart attack or stroke in RA patients.

This cohort study was sponsored by Genentech, which markets tocilizumab (Actemra). Dr. Kim reported ties to Genentech, Lilly, Pfizer, Bristol-Myers Squibb, and AstraZeneca. Her associates reported ties to Genentech, Lilly, Pfizer, AstraZeneca, Amgen, Corrona, Whiscon, Aetion, and Boehringer Ingelheim. Three of the seven authors were employees of Genentech.

Vitals

Key clinical point: Patients with refractory RA who switch to tocilizumab show no increased cardiovascular risk, compared with those who switch to a TNF inhibitor.

Major finding: The primary outcome – the rate of a composite of MI and stroke – was 0.52 per 100 person-years with tocilizumab and 0.59 per 100 person-years for TNF inhibitors, a nonsignificant difference.

Data source: A cohort study involving 28,028 adults with RA enrolled in three large health care claims databases from all 50 states who were followed for a mean of 1 year.

Disclosures: This study was sponsored by Genentech, which markets tocilizumab (Actemra). Dr. Kim reported ties to Genentech, Lilly, Pfizer, Bristol-Myers Squibb, and AstraZeneca. Her associates reported ties to Genentech, Lilly, Pfizer, AstraZeneca, Amgen, Corrona, Whiscon, Aetion, and Boehringer Ingelheim. Three of the seven authors were employees of Genentech.

Norovirus reporting tool yields real-time outbreak data

NoroSTAT, the Centers for Disease Control and Prevention’s new program with which states can report norovirus outbreaks, yields more timely and more complete epidemiologic and laboratory data, which allows a faster and better-informed public health response to such outbreaks, according to a report published in the Morbidity and Mortality Weekly Report.

 

The CDC launched NoroSTAT (Norovirus Sentinel Testing and Tracking) in 2012 to permit the health departments in selected states to report specific epidemiologic and laboratory data regarding norovirus outbreaks more rapidly than usual – within 7 business days, said Minesh P. Shah, MD, of the Epidemic Intelligence Service and the division of viral diseases, CDC, Atlanta, and his associates.

This transmission electron micrograph reveals norovirus virions, or virus particles. (Courtesy CDC/Charles D. Humphrey)

They analyzed outbreak data reported by the five states (Minnesota, Ohio, Oregon, Tennessee, and Wisconsin) that initially participated in the program against data reported the usual way by the other states, plus Washington, D.C., and Puerto Rico. They focused on the 3 years before and the 3 years after NoroSTAT was implemented.

NoroSTAT significantly reduced the median interval in reporting epidemiologic data concerning norovirus from 22 days to 2 days and significantly reduced the median interval in reporting relevant laboratory data from 21 days to 3 days. The percentage of reports submitted within 7 business days increased from 26% to 95% among the states participating in NoroSTAT, while remaining low – only 12%-13% – in nonparticipating states. The proportion of complete reports also increased substantially, from 87% to 99.9%, among the participating states.

These improvements likely result from NoroSTAT’s stringent reporting requirements and from the program’s ability “to enhance communication between epidemiologists and laboratorians in both state health departments and at CDC,” Dr. Shah and his associates said (MMWR Morb Mortal Wkly Rep. 2017 Feb 24;66:185-9).

NoroSTAT represents a key advancement in norovirus outbreak surveillance and has proved valuable in early identification and better characterization of outbreaks. It was expanded to include nine states in August 2016, the investigators added.

Vitals

Key clinical point: NoroSTAT, the CDC’s new program with which states can report norovirus outbreaks, yields more timely and complete epidemiologic data.

Major finding: NoroSTAT significantly reduced the median interval in reporting epidemiologic data concerning norovirus from 22 days to 2 days and significantly reduced the median interval in reporting relevant laboratory data from 21 days to 3 days.

Data source: A comparison of epidemiologic and laboratory data reported by all 50 states for the 3 years before and the 3 years after NoroSTAT was implemented in 5 states.

Disclosures: This study was sponsored by the Centers for Disease Control and Prevention. No financial disclosures were provided.

In CML, imatinib benefits persist at 10 years

In patients with chronic myeloid leukemia (CML), the safety and benefits of imatinib persisted over the course of 10 years in a follow-up study reported online March 9 in the New England Journal of Medicine.

The initial results of the phase III International Randomized Study of Interferon and STI571 (IRIS) trial “fundamentally changed CML treatment and led to marked improvements in prognosis for patients” when imatinib was shown to be more effective than interferon plus cytarabine among newly diagnosed patients in the chronic phase of the disease, said Andreas Hochhaus, MD, of the Klinik für Innere Medizin II, Universitätsklinikum Jena (Germany), and his associates. The researchers are now reporting the final follow-up results after a median of 10.9 years for the 553 patients who had been randomly assigned to receive daily oral imatinib in the IRIS trial.

A small, hypolobated megakaryocyte (center of field) in a bone marrow aspirate, typical of chronic myelogenous leukemia. (Courtesy Wikimedia Commons/Difu Wu/Creative Commons License)

The estimated rate of event-free 10-year survival was 79.6% with imatinib. In comparison, the estimated event-free 10-year survival was 56.6% among patients assigned to interferon plus cytarabine.

The high rate of crossover to the imatinib group among IRIS study participants precluded a direct comparison of overall survival between the two study groups at 10 years. However, the estimated overall 10-year survival in the imatinib group alone was 83.3%. “A total of 260 patients (47.0%) were alive and still receiving study treatment at 10 years, 96 patients (17.4%) were alive and not receiving study treatment, 86 known deaths (15.6% of patients) had occurred, and 111 patients (20.1%) had unknown survival status,” Dr. Hochhaus and his associates reported (N Engl J Med. 2017 Mar 9. doi: 10.1056/NEJMoa1609324).

Approximately 9% of those in the imatinib group had a serious adverse event during follow-up, including 4 patients (0.7%) in whom the event was considered to be related to the drug. Most occurred during the first year of treatment and declined over time. No new safety signals were observed after the 5-year follow-up.

“These results highlight the safety and efficacy of imatinib therapy, with a clear improvement over the outcomes that were expected in patients who received a diagnosis of CML before the introduction of tyrosine kinase inhibitor therapy, when interferon alfa and hematopoietic stem-cell transplantation were the standard therapies,” the investigators said.

Second-generation tyrosine kinase inhibitors have been developed since the IRIS trial was begun, and “it remains to be seen whether they will have similarly favorable long-term safety. Given the long-term safety and efficacy results with imatinib and the increasing availability of generic imatinib, comparative analyses evaluating the available tyrosine kinase inhibitors for first-line therapy are likely to be forthcoming,” they noted.

Vitals

Key clinical point: Imatinib’s benefits in chronic myeloid leukemia and its safety profile persisted over the long term in a 10-year follow-up study.

Major finding: The estimated overall 10-year survival was 83.3%, and no new safety signals were observed after 5-year follow-up.

Data source: Extended follow-up of an open-label international randomized trial involving 553 adults with CML.

Disclosures: This study was funded by Novartis. Dr. Hochhaus reported ties to Novartis and other drug companies.

Nemolizumab improves pruritus in atopic dermatitis

Agent appears to work quickly

Monthly subcutaneous injections of nemolizumab, a humanized monoclonal antibody that inhibits interleukin-31 signaling, significantly improved pruritus associated with atopic dermatitis (AD) in a small, 3-month phase II trial. The results were published online March 2 in the New England Journal of Medicine.

“Although this trial has limitations, most notably the small number of patients and short duration, it provides evidence supporting the role of interleukin-31 in the pathobiologic mechanism of atopic dermatitis,” said Thomas Ruzicka, MD, of the department of dermatology and allergology, Ludwig Maximilian University, Munich, and his associates.

Pruritus aggravates atopic dermatitis and has been linked to loss of sleep, depression, aggressiveness, body disfiguration, and suicidal thoughts. Existing treatments, including emollients, topical glucocorticoids, calcineurin inhibitors, and oral antihistamines, have limited efficacy and can cause adverse effects when used long term, the investigators noted.

They assessed nemolizumab in a manufacturer-funded multiple-dose trial involving 264 adults in the United States, Europe, and Japan who had refractory moderate to severe atopic dermatitis, inadequately controlled with topical treatments. Study participants were randomly assigned in a double-blind fashion to receive 12 weeks of 0.1 mg/kg nemolizumab (53 patients), 0.5 mg/kg nemolizumab (54 patients), 2.0 mg/kg nemolizumab (52 patients), or placebo (53 control subjects) every 4 weeks. Another 52 participants were given 2.0 mg/kg nemolizumab every 8 weeks in an exploratory analysis. All the study participants were permitted to use emollients and localized treatments, and some were permitted by the investigators to use a potent topical glucocorticoid as rescue therapy after week 4.

A total of 216 patients (82%) completed the trial.

The primary efficacy endpoint was the percentage improvement at week 12 in scores on a pruritus visual analogue scale, which patients recorded electronically every day. These scores improved significantly in a dose-dependent manner for active treatment, compared with placebo. Pruritus declined by 43.7% with the 0.1 mg/kg dose (P = .002), 59.8% with the 0.5 mg/kg dose (P less than .001), and 63.1% with the 2.0 mg/kg dose (P less than .001), compared with 20.9% with placebo.

Nemolizumab also bested placebo on several secondary endpoints, including scores on a verbal rating of pruritus, the Eczema Area and Severity Index, and the static Investigator’s Global Assessment, the investigators said (N Engl J Med. 2017;376:826-35. doi: 10.1056/NEJMoa1606490).

The study population was too small to allow the investigators to draw firm conclusions regarding adverse events, particularly given the relatively high number of participants who dropped out. However, patients who received active treatment had higher rates of dermatitis exacerbations and peripheral edema than did those who received placebo.

The group given 0.5 mg/kg nemolizumab every month showed the greatest treatment benefit and the best benefit-to-risk profile, Dr. Ruzicka and his associates said.

This trial was funded by Chugai Pharmaceutical, which also participated in the study design, data collection and analysis, and preparation of the manuscript. Dr. Ruzicka reported receiving research grants and personal fees from Chugai and honoraria from Astellas; his associates reported ties to numerous industry sources.

In addition to the benefits cited by Ruzicka et al., nemolizumab appeared to work quickly, reducing pruritus by nearly 30% within the first week, compared with a slight placebo effect.

Data from larger and longer-term studies, as well as pediatric trials, are needed to fully understand how nemolizumab and other new agents should be incorporated into the management of AD.

It will be important to assess how quickly disease flares occur when these agents are stopped, and whether the concomitant use of other treatments may enhance their effectiveness or induce longer remissions.
 

Lynda C. Schneider, MD, is in the division of immunology at Boston Children’s Hospital. She disclosed having received grant support from Astellas, personal fees from Anacor Pharmaceuticals, and other support from the National Eczema Association outside the submitted work. Dr. Schneider made these remarks in an editorial accompanying the study (N Engl J Med. 2017 Mar 2. doi: 10.1056/NEJMe1616072).

Vitals

Key clinical point: Monthly nemolizumab injections significantly improved pruritus in adults with moderate to severe atopic dermatitis.

Major finding: Pruritus declined by 43.7% with the 0.1-mg/kg dose, 59.8% with the 0.5-mg/kg dose, and 63.1% with the 2.0-mg/kg dose, compared with 20.9% with placebo.

Data source: A manufacturer-funded international randomized double-blind placebo-controlled phase II trial in which 264 adults with moderate to severe AD were treated for 12 weeks; 216 completed the trial.

Disclosures: This trial was funded by Chugai Pharmaceutical, which also participated in the study design, data collection and analysis, and preparation of the manuscript. Dr. Ruzicka reported receiving research grants and personal fees from Chugai and honoraria from Astellas; his associates reported ties to numerous industry sources.

Blinatumomab superior to chemotherapy for refractory ALL

Blinatumomab proved superior to standard chemotherapy for relapsed or refractory acute lymphoblastic leukemia (ALL), based on results of an international phase III trial reported online March 2 in the New England Journal of Medicine.

The trial was halted early when an interim analysis revealed the clear benefit with blinatumomab, Hagop Kantarjian, MD, chair of the department of leukemia, University of Texas MD Anderson Cancer Center, Houston, and his associates wrote.

The manufacturer-sponsored open-label study included 376 adults with Ph-negative B-cell precursor ALL that was refractory to primary induction therapy or to salvage with intensive combination chemotherapy, in first relapse with a first remission lasting less than 12 months, in second or later relapse, or in relapse at any time after allogeneic stem cell transplantation.

Study participants were randomly assigned to receive either blinatumomab (267 patients) or the investigator’s choice of one of four protocol-defined regimens of standard chemotherapy (109 patients) and were followed at 101 medical centers in 21 countries for a median of 11 months.

For each 6-week cycle of blinatumomab therapy, patients received treatment for 4 weeks (9 mcg blinatumomab per day during week 1 of induction cycle one and 28 mcg/day thereafter, by continuous infusion) and then no treatment for 2 weeks.

Maintenance treatment with blinatumomab was given as a 4-week continuous infusion every 12 weeks.

At the interim analysis – when 75% of the total number of planned deaths for the final analysis had occurred – the monitoring committee recommended that the trial be stopped early because of the benefit observed with blinatumomab therapy. Median overall survival was significantly longer with blinatumomab (7.7 months) than with chemotherapy (4 months), with a hazard ratio for death of 0.71. The estimated survival at 6 months was 54% with blinatumomab and 39% with chemotherapy.

Remission rates also favored blinatumomab: Rates of complete remission with full hematologic recovery were 34% vs. 16% and rates of complete remission with full, partial, or incomplete hematologic recovery were 44% vs. 25%.

In addition, the median duration of remission was 7.3 months with blinatumomab and 4.6 months with chemotherapy, and 6-month estimates of event-free survival were 31% vs. 12%. These survival and remission benefits were consistent across all subgroups of patients and persisted in several sensitivity analyses, the investigators said (N Engl J Med. 2017 Mar 2. doi: 10.1056/NEJMoa1609783).

A total of 24% of the patients in the blinatumomab group and 24% of the patients in the chemotherapy group underwent allogeneic stem cell transplantation, with comparable outcomes and death rates.

Serious adverse events occurred in 62% of patients receiving blinatumomab and in 45% of those receiving chemotherapy, including fatal adverse events in 19% and 17%, respectively. The fatal events were considered to be related to treatment in 3% of the blinatumomab group and in 7% of the chemotherapy group. Rates of treatment discontinuation because of an adverse event were 12% and 8%, respectively.

Patient-reported health status and quality of life improved with blinatumomab but worsened with chemotherapy.

“Given the previous exposure of these patients to myelosuppressive and immunosuppressive treatments, the activity of an immune-based therapy such as blinatumomab, which depends on functioning T cells for its activity, provides encouragement that responses may be further enhanced and made durable with additional immune activation strategies,” Dr. Kantarjian and his associates noted.

Dr. Kantarjian reported receiving research support from Amgen, Pfizer, Bristol-Myers Squibb, Novartis, and ARIAD; his associates reported ties to numerous industry sources.


FROM THE NEW ENGLAND JOURNAL OF MEDICINE

Vitals

 

Key clinical point: Blinatumomab proved superior to standard chemotherapy for relapsed or refractory ALL in a phase III trial.

Major finding: Median overall survival with blinatumomab (7.7 months) was significantly longer than with chemotherapy (4.0 months), and rates of complete remission with full hematologic recovery were 34% vs 16%, respectively.

Data source: A manufacturer-sponsored, international, randomized, open-label phase III trial involving 376 adults followed for a median of 11 months.

Disclosures: This trial was funded by Amgen, which also participated in the study design, data analysis, and report writing. Dr. Kantarjian reported receiving research support from Amgen, Pfizer, Bristol-Myers Squibb, Novartis, and ARIAD; his associates reported ties to numerous industry sources.

Sirukumab found effective, safe for highly refractory RA

Comparison study would be useful
Article Type
Changed
Thu, 12/06/2018 - 11:34

 

The investigational interleukin-6 inhibitor sirukumab proved effective and safe for rheumatoid arthritis patients who failed to respond to or were intolerant of multiple previous therapies in a phase III trial reported online in The Lancet.

Body

 

It would be useful to compare sirukumab’s efficacy against that of two other inhibitors of the interleukin-6 pathway, tocilizumab (Actemra) and sarilumab.

Dr. Roy M. Fleischmann
But until a head-to-head study is performed, it is likely that at least some patients will find sirukumab to be superior to these agents. The efficacy and the risk-benefit profile reported here support the use of sirukumab for active RA in patients who are refractory to TNF inhibitors and other treatments.

Roy Fleischmann, MD, is with the University of Texas Southwestern Medical Center and Metroplex Clinical Research Center, both in Dallas. He reported receiving research grants and consulting fees from Genentech-Roche, Sanofi-Aventis, and GlaxoSmithKline. Dr. Fleischmann made these remarks in an editorial accompanying Dr. Aletaha and colleagues’ report (Lancet. 2017 Feb 15. doi: 10.1016/S0140-6736[17]30405-1).


FROM THE LANCET

Vitals

 

Key clinical point: Sirukumab proved effective and safe for RA patients who failed to respond to or were intolerant of multiple previous therapies.

Major finding: The primary efficacy endpoint – the proportion of patients achieving an ACR20 response at week 16 – was 40% for low-dose and 45% for high-dose sirukumab, compared with 24% for placebo.

Data source: A manufacturer-sponsored, international, randomized, double-blind, placebo-controlled phase III trial involving 878 adults with refractory RA.

Disclosures: This trial was funded by Janssen and GlaxoSmithKline, which also participated in the study design, data collection and analysis, and writing of the results. Dr. Aletaha reported serving as a consultant for or receiving research support from AbbVie, Pfizer, Grünenthal, Merck, Medac, UCB, Mitsubishi/Tanabe, Janssen, and Roche. His associates reported ties to numerous industry sources.

Rectal cancer proportion in young doubled

Some trends in colorectal cancer may have dietary and environmental influences
Article Type
Changed
Wed, 04/05/2017 - 16:02

 

The proportion of rectal cancer cases diagnosed in people younger than 55 years doubled over the past 2 decades, according to a report published online Feb. 28 in the Journal of the National Cancer Institute.

In contrast, the proportion diagnosed in people older than 55 years has decreased over the last 4 decades, said Rebecca L. Siegel, MPH, strategic director of surveillance information services in surveillance and health services research at the American Cancer Society, and her associates.

[Image: A tumor budding in colorectal carcinoma. Courtesy Wikimedia Commons/nephron/Creative Commons License]
They examined time trends in colorectal cancer (CRC) incidence using data from nine geographic areas of the Surveillance, Epidemiology, and End Results program for people aged 20 years and older who were diagnosed between 1974 and 2013. They used a statistical tool called age-period-cohort modeling to help differentiate factors that influence all age groups (period effects), such as changes in medical practice, from factors that vary by generation (cohort effects), which typically result from behavioral changes (J Natl Cancer Inst. 2017. doi: 10.1093/jnci/djw322). The study population comprised 490,305 patients.
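The period-versus-cohort distinction can be made concrete with a small synthetic example. The sketch below (Python; the toy data and model specification are ours, simplified well past the paper's actual methods) fits Poisson models to simulated counts and shows why analysts compare an age-period fit against an age-cohort fit:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Synthetic illustration (not SEER data). Counts are modeled with Poisson
# regression; because cohort = period - age, the three effects are linearly
# dependent and cannot all be estimated freely, so one common tactic is to
# compare an age-period model against an age-cohort model.
rng = np.random.default_rng(0)
rows = []
for age in range(20, 80, 10):              # midpoints of 10-year age groups
    for period in range(1975, 2015, 10):   # calendar decades
        cohort = period - age              # birth cohort
        py = 1e5                           # person-years at risk (synthetic)
        # Built-in truth: risk rises with age and with later birth cohorts.
        rate = 1e-5 * np.exp(0.06 * (age - 20) + 0.02 * (cohort - 1900))
        rows.append(dict(age=age, period=period, cohort=cohort,
                         py=py, cases=rng.poisson(rate * py)))
df = pd.DataFrame(rows)

ap = smf.glm("cases ~ C(age) + C(period)", data=df,
             family=sm.families.Poisson(), offset=np.log(df["py"])).fit()
ac = smf.glm("cases ~ C(age) + C(cohort)", data=df,
             family=sm.families.Poisson(), offset=np.log(df["py"])).fit()
print("age-period deviance:", round(ap.deviance, 1))
print("age-cohort deviance:", round(ac.deviance, 1))  # lower here: cohort wins
```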

The incidence of rectal cancer increased by 3.2% per year during the study period among patients aged 20-29 years and in those aged 30-39 years. It didn’t begin rising until the 1990s in adults aged 40-49 years and 50-54 years, and then it rose by a smaller amount – 2.3% per year. In contrast, the incidence of rectal cancer generally declined throughout the 40-year study period among adults aged 55 and older.

Because of these opposing trends, there was a net increase in rectal cancer of 4% per year for people in their twenties together with a net decrease of 2% per year for those aged 75 years and older.
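The compounding behind those annual figures is easy to check; this is illustrative arithmetic of ours, not a calculation from the paper:

```python
# A steady annual percent change of r sustained for n years multiplies
# incidence by (1 + r)**n.
for r in (0.032, 0.04):
    print(f"{r:.1%}/year for 20 years -> x{(1 + r) ** 20:.2f}")
# 3.2%/year -> x1.88; 4.0%/year -> x2.19 -- roughly a doubling over 2 decades.
```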

The decreasing rate of rectal cancer in older adults “may partly reflect detection and removal of precancerous lesions during clinical inspection of the rectum, which was common practice well before formal [CRC] screening. Inherent differences within the colorectum in the way environmental factors initiate and or promote carcinogenesis, as well as the influence of unknown risk factors, may also have contributed,” Ms. Siegel and her associates said.

The temporal pattern was somewhat different for colon cancer. The risk of colon cancer declined “for successive generations during the first half of the twentieth century but has escalated back to the level of those born circa 1890 for current birth cohorts.”

The rising incidence of both colon and rectal cancers among younger adults is “sobering,” given that such trends “often provide a bellwether of the future disease burden,” they noted.

“The strong birth cohort effects we observed signal relatively recent changes in exposures that influence risk,” including excess body weight, high intake of processed meat, low intake of dietary fiber, and low levels of physical activity. “New strategies to curb the obesity epidemic and shift Americans toward healthier eating and more active lifestyles” are needed, the researchers said.

In addition, both clinicians and the public must be educated about the rising probability of the disease in people younger than 55 years. Timely follow-up of symptoms, regardless of patient age, must be emphasized. Younger adults are nearly 60% more likely than are older adults to be diagnosed with advanced CRC, largely because they delay seeking medical care. The disease simply isn’t “on the radar” of young adults or their providers, the investigators added.

This study was supported by the American Cancer Society and the National Institutes of Health. Ms. Siegel and her associates did not provide conflict of interest disclosures.

 

AGA Resource

The AGA Colorectal Cancer Clinical Service Line provides tools to help you become more efficient, understand quality standards and improve the process of care for patients: http://www.gastro.org/patient-care/conditions-diseases/colorectal-cancer

Body

Colorectal cancer has been a “good news” story over the past 10-15 years. In the United States, we have seen a 30% reduction in both incidence and mortality over 10 years. This may be due to many factors, including increased rates of screening. The increased use of aspirin for cardiovascular protection, NSAIDs for joint and muscle pain, use of hormone replacement therapy, and reductions in smoking all likely contribute to the trend in CRC reduction.

Dr. David Lieberman
Despite this good news, there is further evidence of rising incidence of CRC in individuals younger than 54 years over the past 30 years. Rates of both colon and rectal cancer are increasing for 20- to 54-year-olds. This age group represents a small absolute risk of CRC, accounting for less than 10% of cases, but the trend is disturbing and begs explanation. Obesity, diabetes, and metabolic syndrome are increasing in younger individuals, and these are potential risk factors for CRC.

New or changing environmental exposures may place younger people at risk. The introduction of industrialized food in our diet over the past 4 decades could have both direct and indirect effects. It is possible that some food chemicals could be carcinogenic, but it is also quite possible that alteration of the microbiome by diet and environmental factors could lead to development of neoplasia in predisposed individuals. The use of antibiotics in our food chain may alter the microbiome.

There is considerable state-to-state variation in rates of CRC incidence and mortality. This is not new, but remains largely unexplained. The highest risk appears to be in the so-called “Rust Belt” and deep South, raising questions about environmental exposures that might predispose to CRC. Lower rates in states like Texas, Colorado, and California may be influenced by the population mix. There is evidence that Hispanics may have a lower age-adjusted risk of CRC than blacks and Caucasians, so higher proportions of low-risk groups could affect the statewide risk of CRC. The differences between high-risk (West Virginia’s death rate of 23.4/100,000) and low-risk (Utah’s death rate of 8.7/100,000) states are too large to be explained by demographic differences alone, and strongly suggest an environmental culprit.

David Lieberman, MD, is professor of medicine; chief of the division of gastroenterology and hepatology, Oregon Health and Science University, Portland; and Vice President of the AGA Institute.

FROM THE JOURNAL OF THE NATIONAL CANCER INSTITUTE

Vitals

 

Key clinical point: The proportion of rectal cancer cases diagnosed in people younger than age 55 doubled over the past 2 decades.

Major finding: The incidence of rectal cancer increased by 3.2% per year among patients aged 20-29 and 30-39 years, and increased by 2.3% per year in those aged 40-49 years and 50-54 years, but declined among adults aged 55 and older.

Data source: A retrospective cohort study involving 490,305 patients aged 20 years and older diagnosed between 1974 and 2013.

Disclosures: This study was supported by the American Cancer Society and the National Institutes of Health. Ms. Siegel and her associates did not provide conflict of interest disclosures.

Pediatric lupus patients face large burden of serious infection

Ten times higher than general rate
Article Type
Changed
Fri, 01/18/2019 - 16:34

 

The burden of serious infection is quite high among children who have systemic lupus erythematosus, with a “striking” preponderance of bacterial pneumonia, according to a report published in Arthritis Care & Research.

Infections are known to be commonplace among systemic lupus erythematosus (SLE) patients in general, and the increased risk is attributed both to the disease and to immunosuppressant therapies. However, most information on this topic comes from studies of adult patients seen at individual academic medical centers, said Linda T. Hiraki, MD, ScD, of the division of rheumatology at The Hospital for Sick Children, Toronto, and her associates.

To examine the nationwide prevalence of serious infections among children with SLE, they analyzed administrative data from a Medicaid database. They focused on 3,500 patients aged 5-18 years, including 1,297 who also had lupus nephritis, who were enrolled in Medicaid during 2000-2006 and followed for a mean of 2.6 years. This yielded a cumulative follow-up of more than 10,100 person-years (Arthritis Care Res. 2017 Feb 19. doi: 10.1002/acr.23219).

A total of 593 of these children had 1,053 serious infections requiring hospitalization during the study period, including 326 children with concurrent lupus nephritis who had 624 infections. A total of 17% of the entire study population and 25% of the subset with lupus nephritis developed at least one such infection. A substantial proportion – 18% of the overall cohort and 21% of those with lupus nephritis – had three or more serious infections requiring hospitalization during the relatively short follow-up, Dr. Hiraki and her associates noted.

The overall incidence was 10.4 serious infections per 100 person-years, and it was 17.65 per 100 person-years in the subset of patients who had lupus nephritis. By comparison, this overall rate is nearly four times higher than that reported for children with juvenile idiopathic arthritis, and the incidence among children with concomitant lupus nephritis is more than six times higher.

Infection rates were markedly higher among African American (incidence rate ratio [IRR], 1.83) and Native American (IRR, 1.81) children, compared with white children. They also were higher in early adolescence (ages 9-12 years) than earlier in childhood (ages 5-8 years), the investigators said.
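The headline rate is straightforward to reconstruct from the reported counts; this is our arithmetic check, using the figures above:

```python
# Incidence rate = events / person-years, scaled to per 100 person-years.
infections = 1_053        # serious infections requiring hospitalization
person_years = 10_100     # cumulative follow-up ("more than 10,100")
print(f"{infections / person_years * 100:.1f} per 100 person-years")  # ~10.4

# An incidence rate ratio (IRR) compares two such rates; the reported IRR of
# 1.83 means African American children had 1.83 times the rate seen in
# white children.
```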

Most of the infections (87%) were bacterial, whereas 11% were viral and 1.3% were fungal. (The type of the remainder could not be determined because the counts were too small for federal reporting.) The most frequent bacterial infections were pneumonia (438 cases), followed by bacteremia (274 cases) and cellulitis (272 cases). Herpes zoster was the most frequent viral infection, accounting for 81 cases. The investigators noted that the low rate of fungal infections may be an artifact of the study protocol, which excluded, for technical reasons, cases of systemic candidiasis.

Not surprisingly, the rate of serious infection was higher among children with a high comorbidity burden than among healthier children.

Overall, the risk of serious infection was 59% higher for patients who took corticosteroids (a minimum of 20 mg/day of prednisone equivalent) during the study’s 6-month baseline period; 67% of patients did so. In contrast, the risk of serious infection did not differ between the 31% of patients who used immunosuppressants during that period and those who did not.

A total of 26 children died within 30 days of hospital admission for a serious infection, for an overall mortality of 4.4% among children who developed serious infections; by comparison, 1.6% of the total cohort of 3,500 died. More than half of the children who died had concomitant lupus nephritis, and 77% were taking corticosteroids when they developed the infections.

It is difficult to determine whether the high infection rate is attributable to SLE itself or to its treatments. More studies are needed to investigate this question, as well as to address the disproportionate incidence among nonwhite children and the potential benefit of prophylactic antibiotics and vaccination, Dr. Hiraki and her associates said.

The Canadian Institutes of Health Research, the Lupus Foundation of America, the Rheumatology Research Foundation, and the National Institutes of Health supported the study. Dr. Hiraki and her associates reported having no relevant disclosures.

Body

 

The study by Hiraki et al. is important because very little is known about the risks of infection in childhood SLE, and there are few sources of data involving large numbers of affected children.

The overall rate of 10.4 serious infections necessitating hospitalization per 100 person-years reported in the study is approximately 10 times higher than the rate in the general Medicaid population. The findings should prompt further study of infection in childhood SLE so we can work toward decreasing this excessive risk.

The investigators unfortunately did not assess medication use throughout the study, nor did they look for factors beyond a high SLE risk adjustment index that were associated with infection. These missed opportunities are the most significant weaknesses of an otherwise well-conducted study: information on the role of disease activity and medication use would have a greater impact on clinical care than the nonetheless useful finding that childhood SLE is associated with a markedly increased infection rate.
 

Timothy Beukelman, MD, and his associates are with the University of Alabama at Birmingham. They made these remarks in an editorial accompanying Dr. Hiraki and colleagues’ report (Arthritis Care Res. 2017 Feb 19. doi: 10.1002/acr.23221). No disclosure information was available with their editorial manuscript.

FROM ARTHRITIS CARE & RESEARCH

Vitals

 

Key clinical point: The burden of serious infection is quite high among children who have SLE, with a “striking” preponderance of bacterial pneumonia.

Major finding: The overall incidence was 10.4 serious infections per 100 person-years, and it was 17.65 per 100 person-years in the subset of patients who had lupus nephritis.

Data source: A retrospective cohort study using administrative Medicaid data for 3,500 affected U.S. children aged 5-18 years.

Disclosures: The Canadian Institutes of Health Research, the Lupus Foundation of America, the Rheumatology Research Foundation, and the National Institutes of Health supported the study. Dr. Hiraki and her associates reported having no relevant disclosures.