Join AGA in Supporting GI Research

Decades of research have revolutionized the care of many patients with digestive diseases. These patients, as well as everyone in the GI field, clinicians and researchers alike, have benefited from the discoveries of dedicated investigators, past and present. As the charitable arm of the American Gastroenterological Association (AGA), the AGA Research Foundation contributes to this tradition of discovery to combat the suffering and diminished quality of life brought on by digestive diseases.

AGA Institute
Ms. Rani Richardson
The AGA Research Foundation’s mission is to raise funds to support young researchers in gastroenterology and hepatology. The foundation provides a key source of funding at a critical juncture in a young investigator’s career.

“Using this award, I plan to study the cytoskeletal intermediate filament proteins that are expressed in digestive-type epithelia, allowing me to better understand the molecular basis of GI diseases. My goal is to create a career in medical research and develop more ways to make biomedical research meaningful for clinical health-care professionals, and ultimately for patients,” said Rani Richardson, recipient of the 2016 AGA Investing in the Future Student Research Fellowship Award.
 

By joining others in donating to the AGA Research Foundation, you can help fill the funding gap and protect the next generation of investigators.

Help provide critical funding to young researchers today by making a donation to the AGA Research Foundation on the foundation’s website at www.gastro.org/contribute or by mail to 4930 Del Ray Avenue, Bethesda, MD 20814.

Serum magnesium level reflects risk of death, irrespective of CKD

– Low levels of serum magnesium were associated with increased all-cause mortality, whether or not patients had chronic kidney disease, in a single-center, retrospective study of 3,551 people.

The association was independent of sociodemographic factors, comorbidities, and use of diuretics.

If causality is shown, “magnesium supplementation could be a simple therapy to lessen the chance of death in CKD patients,” study investigator Silvia Ferrè, PhD, University of Texas Southwestern, Dallas, said in an interview regarding the results of the Dallas Heart Study, which was presented at the annual meeting of the American Society of Nephrology.

Brian Hoyle/Frontline Medical News
Dr. Silvia Ferrè
This retrospective analysis involved 3,551 participants in the Dallas Heart Study who had serum magnesium measured at their baseline visit. Of these, 3,245 subjects did not develop CKD and 306 did. Both groups were stratified into low, medium, and high serum magnesium tertiles: in the non-CKD group, these tertiles comprised 736, 1,461, and 1,048 subjects, respectively; the respective numbers in the CKD group were 118, 109, and 79.

In both groups, the subjects with low serum magnesium were younger and more likely to be female, had a higher body mass index, and were more burdened by comorbidities, including type 2 diabetes mellitus and hypertension. Subjects without CKD who had low serum magnesium were significantly more likely to use diuretics; among subjects with CKD, diuretic use was comparable regardless of serum magnesium level.

Irrespective of CKD status, survival over the median 12.3-year follow-up was significantly lower in subjects with low serum magnesium than in the other two serum magnesium tertiles (P < .001 and P = .03, respectively). After adjustment for age, gender, race/ethnicity, body mass index, phosphorus, calcium, bicarbonate, albumin, intact parathyroid hormone, total cholesterol, high-density lipoprotein cholesterol, and use of diuretics and supplements, low serum magnesium remained independently associated with all-cause death in subjects with CKD (hazard ratio, 1.92; 95% confidence interval, 1.03-3.59; P = .04), with a similar but not statistically significant trend in those without CKD (HR, 1.43; 95% CI, 0.95-2.15; P = .09), compared with high serum magnesium as the referent.

Dr. Ferrè said that screening for serum magnesium as part of routine blood testing, with supplementation when levels are low, might improve survival. Low magnesium levels also have been linked with osteoporosis, diabetes, and cardiovascular disease.

The Dallas Heart Study was a multiethnic, population-based study involving 6,101 adults residing in Dallas County. The study, which ran from 2000 to the end of 2011, was designed to explore the early detection of cardiovascular disease and the social, behavioral, and environmental factors associated with risk, with the goal of identifying interventions that can be provided at the community level.

The study sponsor was University of Texas Southwestern Medical Center. The study was funded by the National Institutes of Health and the Donald W. Reynolds Foundation. Dr. Ferrè reported having no financial disclosures.

Vitals

 

Key clinical point: If causality is shown, magnesium supplementation could be a simple therapy to reduce mortality in patients with chronic kidney disease.

Major finding: Low serum magnesium was significantly associated with risk of death in patients without CKD (P < .01) and patients with CKD (P = .03).

Data source: A single-center, retrospective cohort of 3,551 patients in the Dallas Heart Study.

Disclosures: The study sponsor was University of Texas Southwestern Medical Center. The study was funded by the National Institutes of Health and the Donald W. Reynolds Foundation. Dr. Ferrè reported having no financial disclosures.

Constipation severity linked with chronic kidney disease and decline in kidney function

– Constipation was associated with poor kidney health in a large nationwide cohort of 3.5 million United States veterans, and researchers are considering whether effectively treating constipation could help prevent or treat kidney disease.

“In this large nationwide cohort ... patients with constipation had higher risks of developing chronic kidney disease and end-stage renal disease, and were more likely to experience rapid decline in kidney function, even after adjusting for various known risk factors. We also found that more severe constipation was associated with an incrementally higher risk for both incident CKD (chronic kidney disease) and ESRD (end-stage renal disease),” said Keiichi Sumida, MD, a visiting scholar at the University of Tennessee Health Science Center in Memphis.

Dr. Keiichi Sumida
Dr. Sumida presented a poster describing the findings at the American Society of Nephrology’s Kidney Week 2016 meeting.

In a multivariable analysis, those with constipation had a 13% higher likelihood of developing CKD (hazard ratio, 1.13; 95% confidence interval, 1.11-1.14) and a 9% higher likelihood of developing ESRD (HR, 1.09; 95% CI, 1.01-1.18) compared with those without constipation. In addition, those with constipation experienced a faster decline in estimated glomerular filtration rate (eGFR).

Scrutiny of US Department of Veterans Affairs databases identified nearly 4.5 million patients with serum creatinine measurements obtained between October 2004 and September 2006. Of these, 3,504,732 patients had an eGFR of at least 60 mL/min per 1.73 m² and no other evidence of CKD. All were followed through 2013.

Constipation was defined as at least two ICD-9-CM diagnoses of constipation made at least 60 days apart, or two or more prescriptions for laxatives at least 60 days apart, within a 1-year period. The severity of constipation was based on the number of different types of laxatives prescribed: no laxative use was considered absence of constipation, one laxative type indicated mild constipation, and two or more laxative types indicated severe constipation.
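For readers who want to see the case definition operationalized, the sketch below encodes the constipation definition and severity grading described above. It is a minimal illustration only, not code from the study; the function names, inputs, and example dates are assumptions.

```python
# Minimal sketch of the constipation definition and severity grading described
# above. Not code from the study; function names and inputs are assumptions.
from datetime import date

def meets_constipation_definition(icd9_dx_dates: list[date],
                                  laxative_rx_dates: list[date]) -> bool:
    """At least two ICD-9-CM constipation diagnoses made >= 60 days apart,
    or two or more laxative prescriptions >= 60 days apart."""
    def two_events_60_days_apart(dates: list[date]) -> bool:
        if len(dates) < 2:
            return False
        dates = sorted(dates)
        return (dates[-1] - dates[0]).days >= 60
    return (two_events_60_days_apart(icd9_dx_dates)
            or two_events_60_days_apart(laxative_rx_dates))

def constipation_severity(n_laxative_types: int) -> str:
    """0 laxative types = no constipation, 1 = mild, 2 or more = severe."""
    if n_laxative_types <= 0:
        return "none"
    return "mild" if n_laxative_types == 1 else "severe"

# Example with made-up dates: two laxative prescriptions 90 days apart qualify.
print(meets_constipation_definition([], [date(2005, 1, 10), date(2005, 4, 10)]))  # True
print(constipation_severity(2))  # "severe"
```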

Co-primary outcomes were incident CKD, incident ESRD, and change in eGFR from baseline. As expected in the propensity-matched cohort, baseline demographic and clinical characteristics were comparable for the 253,441 individuals with constipation and the 3,251,291 individuals without.

“Our findings highlight the plausible link between the gut and the kidneys, and provide additional insights into the pathogenesis of kidney disease progression. Our results suggest the need for careful observation of kidney function in patients with constipation, particularly among those with more severe constipation,” Dr. Sumida concluded.

Dr. Sumida hypothesized that altered gut microflora in constipation may result in inflammation, changes in metabolites, or accumulation of toxins. Alternative explanations include increased serotonin related to laxative use, nephrotoxicity, dehydration, or electrolyte imbalance.

These possibilities need to be examined, as does the idea that relieving constipation could prevent renal decline. “Given the high prevalence of constipation in the general population and the simplicity of its assessment in primary care settings, the management of constipation through lifestyle modifications and/or use of probiotics rather than laxatives could become a useful tool in preventing the development of CKD, or in retarding the progression of existing CKD,” Dr. Sumida said.

Vitals

 

Key clinical point: The presence and severity of constipation were associated with increased risks of developing chronic kidney disease and end-stage renal disease, and with an accelerated decline in kidney function.

Major finding: Individuals with constipation were 13% more likely to develop chronic kidney disease and 9% more likely to develop end-stage renal disease, compared with those without constipation.

Data source: Retrospective analysis of US Department of Veterans Affairs databases; the study included 3,504,732 subjects.

Disclosures: The study sponsor was the University of Tennessee Health Science Center. Funding was provided by the United States Department of Veterans Affairs. Dr. Sumida reported having no financial disclosures.

Cell ratios predict short-term risk of death in patients beginning hemodialysis

– Two simple-to-calculate ratios – neutrophil lymphocyte ratio and platelet lymphocyte ratio – may be able to predict impending death in patients who have recently begun hemodialysis, based on data from 108,548 incident hemodialysis patients in the database of DaVita HealthCare Partners from 2007 to 2011.

“Neutrophil lymphocyte ratio (NLR) and platelet lymphocyte ratio (PLR), inflammatory and nutritional indices that are calculated from the complete blood count, were identified as strong predictors of impending death ... and thus are inexpensive and immediately available markers for predicting short-term mortality,” said Yoshitsugu Obi, MD, PhD, a visiting scholar at the Harold Simmons Center for Kidney Disease Research & Epidemiology, University of California, Irvine, School of Medicine.
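For readers unfamiliar with these indices, the sketch below shows in general terms how NLR and PLR are derived from a complete blood count. It is illustrative only and not code from the study; the function names and example values are assumptions, and the numeric scale of PLR depends on the units in which the platelet and lymphocyte counts are reported (the study’s exact convention is not described here).

```python
# Illustrative sketch (not from the study): computing NLR and PLR from a
# complete blood count. Variable names and example values are assumptions.

def neutrophil_lymphocyte_ratio(neutrophils: float, lymphocytes: float) -> float:
    """NLR = absolute neutrophil count divided by absolute lymphocyte count."""
    if lymphocytes <= 0:
        raise ValueError("Lymphocyte count must be positive")
    return neutrophils / lymphocytes

def platelet_lymphocyte_ratio(platelets: float, lymphocytes: float) -> float:
    """PLR = platelet count divided by absolute lymphocyte count."""
    if lymphocytes <= 0:
        raise ValueError("Lymphocyte count must be positive")
    return platelets / lymphocytes

# Example with made-up CBC values (all counts in the same units):
nlr = neutrophil_lymphocyte_ratio(neutrophils=5.5, lymphocytes=1.5)
plr = platelet_lymphocyte_ratio(platelets=220.0, lymphocytes=1.5)
print(f"NLR = {nlr:.2f}, PLR = {plr:.1f}")  # NLR = 3.67, PLR = 146.7
```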

Dr. Yoshitsugu Obi
The findings reported by Dr. Obi as a poster at the American Society of Nephrology’s Kidney Week 2016 meeting extend the utility of the NLR and PLR beyond their established value in predicting the prognosis of cancer.

The data were obtained from the database of a large dialysis organization; 108,548 patients who began hemodialysis from 2007 to 2011 were included. The NLR values were divided into 12 categories, with less than 1.5 and 6.5 or greater as the bracketing categories; the 10 intervening categories differed incrementally by 0.5. Eight PLR categories were created, with less than 5 and 35 or greater as the bracketing categories; the six intervening categories differed incrementally by 5.

The mean age of the cohort was 63 ± 15 years. Males predominated (56%), 59% of the subjects were diabetic, and 31% were African American. At baseline the median NLR and PLR were 3.64 and 13.12, respectively.

In an unadjusted regression analysis, the categories of NLR and PLR had a strong and linear relationship with all-cause mortality. In an analysis that adjusted for covariates, including demographics and comorbidities, as well as markers of malnutrition and inflammation, the association of the two ratios with all-cause mortality persisted.

Unlike previous small and inconclusive studies, the size of the present study lends robustness to the association between these cell ratios and death in dialysis patients, he said. The plan now is to compare the predictive value of NLR and PLR for mortality with that of other established risk factors, including albumin, phosphorus, and alkaline phosphatase.

Vitals

 

Key clinical point: The neutrophil lymphocyte ratio (NLR) and platelet lymphocyte ratio (PLR) are strongly associated with imminent death in patients who have recently started hemodialysis.

Major finding: Increasing NLR and PLR were linearly associated with death in 108,548 hemodialysis patients.

Data source: Database of DaVita HealthCare Partners from 2007 to 2011.

Disclosures: The study was sponsored by the University of California, Irvine, School of Medicine. The study was funded by the National Institutes of Health. Dr. Obi had no disclosures.

Subclinical AF found in 1/3 of asymptomatic elderly

– About a third of elderly people at high cardiovascular risk but otherwise healthy and asymptomatic had subclinical atrial fibrillation in a multicenter study of 273 people.

This finding that subclinical atrial fibrillation (AF) is “extremely common” in elderly people with cardiovascular risk factors “weakens the case that detecting subclinical AF in patients following a stroke implies causality” of the stroke “because subclinical AF is so prevalent,” Jeff S. Healey, MD, said at the American Heart Association Scientific Sessions.

Jeff S. Healey
He advised against taking any new steps to screen for or treat subclinical AF. Possible benefit from treating patients with subclinical AF with an anticoagulant is “unproven,” noted Dr. Healey. He also called it “premature” to routinely screen people aged 65 or older with an enlarged left atrium by implanting a loop recorder.

“I think that subclinical AF is a distinct subgroup of AF, with a risk for stroke that is quite low, about 1.5%-2% per year,” said Dr. Healey, a cardiologist at McMaster University in Hamilton, Canada. “Given that this was an elderly population [study participants averaged 74 years old] with bleeding risk, it’s reasonable to question” whether many people with subclinical AF need anticoagulation. The question of whether “45 seconds of AF seen 6 months after a stroke is worthy of treatment with an anticoagulant should give people pause,” he said.

The Prevalence of Sub-Clinical Atrial Fibrillation Using an Implantable Cardiac Monitor (ASSERT-II) study initially enrolled 273 people at 26 sites in Canada and The Netherlands. Researchers actually placed a loop recorder in 256, and complete follow-up of at least 9 months occurred for 252. Enrolled patients had to be at least 65 years old and have at least one of these risk factors for AF or stroke: a CHA2DS2-VASc score of 2 or greater; documented obstructive sleep apnea; or a body mass index greater than 30 kg/m². In addition, enrollees had to have one of these risk factors for AF: a left atrial volume of at least 58 mL; a left atrial diameter of at least 4.4 cm; or a serum NT-proBNP level of at least 290 pg/mL.
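For context on the entry criteria, the CHA2DS2-VASc score is a standard stroke-risk score in atrial fibrillation. The sketch below tallies its conventional components; it is illustrative only, is not part of the ASSERT-II protocol, and the field names are assumptions.

```python
# Illustrative tally of the conventional CHA2DS2-VASc components (not part of
# the ASSERT-II protocol); field names are assumptions for this example.
from dataclasses import dataclass

@dataclass
class RiskProfile:
    age: int
    female: bool
    heart_failure: bool = False      # C: congestive heart failure / LV dysfunction (1 point)
    hypertension: bool = False       # H: hypertension (1 point)
    diabetes: bool = False           # D: diabetes mellitus (1 point)
    stroke_tia: bool = False         # S2: prior stroke/TIA/thromboembolism (2 points)
    vascular_disease: bool = False   # V: MI, peripheral artery disease, or aortic plaque (1 point)

def cha2ds2_vasc(p: RiskProfile) -> int:
    score = 0
    score += 1 if p.heart_failure else 0
    score += 1 if p.hypertension else 0
    score += 2 if p.age >= 75 else (1 if p.age >= 65 else 0)  # A2: age >= 75; A: age 65-74
    score += 1 if p.diabetes else 0
    score += 2 if p.stroke_tia else 0
    score += 1 if p.vascular_disease else 0
    score += 1 if p.female else 0    # Sc: sex category (female, 1 point)
    return score                     # possible range 0-9

# A hypothetical 74-year-old woman with hypertension scores 3 (age 1, hypertension 1,
# sex 1), which would satisfy the "CHA2DS2-VASc score of 2 or greater" criterion.
print(cha2ds2_vasc(RiskProfile(age=74, female=True, hypertension=True)))  # 3
```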

Dr. Healey and his associates prespecified subclinical AF as at least 5 minutes of AF seen in the loop recording during follow-up, which occurred in 34% of the participants during an average 16 months of follow-up, he reported. At least 30 minutes of AF occurred in 22% during follow-up, at least 6 hours in 7%, and at least 24 hours in 3%.

In a prespecified set of subgroup analyses, people with a large left atrium formed the only subgroup with a statistically significant association with the outcome. In the multivariate analysis, people with a left atrial volume at or above the study median of 73.5 mL had an 85% increased rate of subclinical AF compared with those with smaller left atria. But increased left atrial size alone did not fully explain subclinical atrial fibrillation: even among participants in the lowest quartile for left atrial diameter, less than 4.3 cm, the prevalence of subclinical AF was 27%, Dr. Healey noted.
 

Findings weaken stroke, subclinical AF link

The results reported by Dr. Healey provide robust data that bridge a major gap we have had in our understanding of atrial fibrillation. The new finding of a high prevalence of subclinical atrial fibrillation in elderly people with cardiovascular risk factors, regardless of whether they had a prior stroke, substantially weakens the case that subclinical atrial fibrillation detected following a stroke has a causal relationship to the stroke. This implication is quite important.

Mitchel L. Zoler/Frontline Medical News
Dr. N.A. Mark Estes III
The finding that 34% of the studied patients had subclinical atrial fibrillation is consistent with results from several prior studies, which have documented subclinical atrial fibrillation prevalence rates of 12%-55%. Many of the prior studies used implanted pacemakers or defibrillation devices to monitor atrial fibrillation; the current study used an implanted loop recorder. For example, a prior study by Dr. Healey involving 2,580 patients with either a pacemaker or implanted defibrillator found that about a third of these patients developed subclinical AF during an average 2.5 years of follow-up (N Engl J Med. 2012 Jan 12;366[2]:120-9). It’s unknown whether there is a difference in the nature of atrial fibrillation detected by a pacemaker or defibrillator and that detected by a loop recorder.

Many questions remain about the meaning of subclinical atrial fibrillation. What relationship does it have with stroke, and what thresholds exist for atrial fibrillation to raise stroke risk? Also, what are the risks and benefits of anticoagulation in people with subclinical AF and is intermittent anticoagulation helpful?

N.A. Mark Estes III, MD, is professor of medicine and director of the New England Cardiac Arrhythmia Center at Tufts Medical Center in Boston. He has been a consultant to Boston Scientific, Medtronic and St. Jude. He made these comments as the designated discussant for ASSERT-II.

Vitals

 

Key clinical point: Subclinical atrial fibrillation is highly prevalent among asymptomatic elderly people with at least two cardiovascular disease risk factors.

Major finding: One-third of asymptomatic elderly people with cardiovascular risk factors had subclinical atrial fibrillation.

Data source: A multicenter study with 252 people followed for an average of 16 months.

Disclosures: Dr. Healey has been a consultant to or received honoraria from Bayer, Medtronic, Pfizer and Servier. He has received research support from Boehringer Ingelheim, Boston Scientific, Bristol-Myers Squibb, Medtronic and St. Jude.

Vaccine candidate can protect humans from malaria

Blood smear showing Plasmodium falciparum. Photo by Mae Melvin, CDC

In a phase 2 trial, a malaria vaccine candidate was able to prevent volunteers from contracting Plasmodium falciparum malaria.

The vaccine, known as PfSPZ vaccine, is composed of live but weakened P falciparum sporozoites.

Three to 5 doses of PfSPZ vaccine protected subjects against malaria parasites similar to those in the vaccine as well as parasites different from those in the vaccine.

A majority of subjects were protected at 3 weeks after vaccination. For some subjects, this protection was sustained at 24 weeks.

PfSPZ vaccine was considered well-tolerated, as all adverse events (AEs) in this trial were grade 1 or 2.

These results were published in JCI Insight.

The study was funded by the US Department of Defense through the Joint Warfighter Program, the Military Infectious Disease Research Program, and US Navy Advanced Medical Development, with additional support from Sanaria, Inc., the company developing PfSPZ vaccine.

“Our military continues to be at risk from malaria as it deploys worldwide,” said Kenneth A. Bertram, MD, of the US Army Medical Research and Materiel Command in Ft Detrick, Maryland.

“We are excited about the results of this clinical trial and are now investing in the ongoing clinical trial to finalize the vaccination regimen for PfSPZ vaccine.”

The current trial included 67 volunteers with a median age of 29.6 years (range, 19 to 45).

They were randomized to 3 treatment groups. Forty-five subjects were set to receive the PfSPZ vaccine, and 22 subjects served as controls.

The volunteers underwent controlled human malaria infection (CHMI) 3 weeks after vaccinated subjects received their final vaccine dose and then again at 24 weeks.

Efficacy at 3 weeks


Group 1

In this group, 13 subjects received 5 doses of the vaccine at 2.7 × 10⁵. They and 6 control subjects underwent CHMI with a homologous strain of P falciparum, Pf3D7.

Twelve of the 13 fully immunized subjects, or 92.3%, did not develop parasitemia. However, all 6 control subjects did, with a median prepatent period of 11.6 days. The prepatent period in the immunized subject who developed parasitemia was 13.9 days.

Group 2

In this group, 5 subjects received 5 doses of the vaccine at 2.7 × 10⁵. They and 4 control subjects underwent CHMI with a heterologous strain of P falciparum, Pf7G8.

Four of the 5 fully immunized subjects, or 80%, did not develop parasitemia. However, all 4 control subjects did, with a median prepatent period of 11.9 days. The prepatent period in the fully immunized subject who developed parasitemia was 11.9 days.

Group 3

In this group, 15 subjects received 3 doses of the vaccine at 4.5 × 10⁵ and underwent homologous Pf3D7 CHMI. The control subjects for this group were the same as those in group 1.

Thirteen of the 15 immunized subjects, or 86.7%, did not develop parasitemia. The prepatent periods in the 2 immunized subjects who did develop parasitemia were 13.9 days and 16.9 days.

Efficacy at 24 weeks


Study subjects underwent a second CHMI at 24 weeks after immunized participants received their final dose of vaccine.

Group 1

Seven of the 10 fully immunized subjects who underwent a second CHMI did not develop parasitemia (70%). However, all 6 control subjects did, with a median prepatent period of 11.6 days. The median prepatent period in the 3 fully immunized subjects who developed parasitemia was 15.4 days.

Group 2

One of the 10 fully immunized subjects who underwent a second CHMI did not develop parasitemia (10%). However, all control subjects did, with a median prepatent period of 10.9 days. The median prepatent period in the 9 fully immunized subjects who developed parasitemia was 11.9 days.

Group 3

Eight of the 14 fully immunized subjects who underwent a second CHMI did not develop parasitemia (57.1%). The median prepatent period in the 6 fully immunized subjects who did develop parasitemia was 14.0 days.

Safety

There were 66 solicited AEs reported within 7 days of immunization that were considered possibly, probably, or definitely related to vaccination. Ninety-two percent of these AEs were grade 1, and 8% were grade 2. All unsolicited AEs reported within 7 days of immunization were grade 1.

The incidence of AEs was not higher in group 3 than in groups 1 or 2, and the incidence of AEs did not increase as subjects received additional doses of the vaccine.

The most common AEs (with an incidence of 10% or higher in at least 1 group) were injection site pain, headache, fatigue, malaise, myalgia, injection site hemorrhage, and cough.

“The results of this clinical trial, along with recent results from other trials of this vaccine in the US and Africa, were critical to our decision to move forward with a trial involving 400 infants in Kenya,” said Tina Oneko, MD, of the Kenya Medical Research Institute, who is the principal investigator of the Kenya trial but was not involved in the current trial.

“This represents significant progress toward the development of a regimen for PfSPZ vaccine that we anticipate will provide a high level [of] efficacy for malaria prevention in all age groups in Africa.”

High-flow oxygen noninferior to noninvasive ventilation postextubation

Clinical Question: Is high-flow oxygen noninferior to noninvasive ventilation (NIV) in preventing postextubation respiratory failure and reintubation?

Background: Studies suggesting that NIV use after extubation reduces the risk of postextubation respiratory failure have led to increased use of this practice. Compared with NIV, high-flow conditioned oxygen therapy has several advantages and fewer adverse effects, suggesting it might be a useful alternative.

Study design: Randomized clinical trial.

Setting: Three ICUs in Spain.

Synopsis: Investigators randomized 604 patients who were identified for planned extubation and at high risk of extubation failure to either NIV or high-flow oxygen therapy via nasal cannula for 24 hours following extubation. High-flow oxygen therapy met the noninferiority threshold relative to NIV with respect to rates of reintubation (22.8% vs. 19.1%, respectively; one-sided 95% CI for the difference, –9.1% to ∞) and postextubation respiratory failure (26.9% vs. 39.8%, respectively; one-sided 95% CI, 6.6% to ∞).

Rates of most secondary outcomes, including infection, mortality, and hospital length of stay (LOS), were similar between the two groups. ICU LOS was significantly shorter in the high-flow oxygen group (3 days vs. 4 days; 95% CI, –6.8 to –0.8).

Additionally, every patient tolerated high-flow oxygen therapy, while 40% of patients in the NIV arm required withdrawal of therapy for at least 6 hours due to adverse effects (P < .001).
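
As an illustration of how a one-sided confidence bound like the one above supports a noninferiority claim, the following rough Python sketch recomputes the reintubation comparison. The rates come from the synopsis; the per-arm sizes (roughly 302 each) and the 10-percentage-point margin are assumptions made for this example, not figures reported here.

from math import sqrt

# Reintubation rates come from the article; arm sizes and margin are assumed for illustration.
n_highflow, n_niv = 302, 302      # assumed near-even split of the 604 randomized patients
p_highflow, p_niv = 0.228, 0.191  # reported reintubation rates

diff = p_niv - p_highflow         # risk difference (negative favors NIV)
se = sqrt(p_highflow * (1 - p_highflow) / n_highflow + p_niv * (1 - p_niv) / n_niv)
lower_bound = diff - 1.645 * se   # one-sided 95% lower confidence bound

margin = -0.10                    # assumed 10-percentage-point noninferiority margin
print(f"difference = {diff:+.1%}; one-sided 95% lower bound = {lower_bound:+.1%}")
print("noninferiority met" if lower_bound > margin else "noninferiority not shown")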

Bottom line: High-flow oxygen immediately following extubation may be a useful alternative to NIV in preventing postextubation respiratory failure.

Citation: Hernández G, Vaquero C, Colinas L, et al. Effect of postextubation high-flow nasal cannula vs noninvasive ventilation on reintubation and postextubation respiratory failure in high-risk patients: a randomized clinical trial. JAMA. 2016;316(15):1565-74.

 

Dr. Murphy is a clinical instructor at the University of Utah School of Medicine and an academic hospitalist at the University of Utah Hospital.

Clonal hematopoiesis increases risk for therapy-related cancers

Small pre-leukemic clones left behind after treatment for non-myeloid malignancies appear to increase the risk for therapy-related myelodysplasia or leukemia, report investigators in two studies.

An analysis of peripheral blood samples taken from patients at the time of their primary cancer diagnosis and bone marrow samples taken at the time of a later therapy-related myeloid neoplasm diagnosis showed that 10 of 14 patients (71%) had clonal hematopoiesis before starting on cytotoxic chemotherapy. In contrast, clonal hematopoiesis was detected in pre-treatment samples of only 17 of 54 controls (31%), reported Koichi Takahashi, MD, and colleagues from the University of Texas MD Anderson Cancer Center in Houston.

“Preleukemic clonal hematopoiesis is common in patients with therapy-related myeloid neoplasms at the time of their primary cancer diagnosis and before they have been exposed to treatment. Our results suggest that clonal hematopoiesis could be used as a predictive marker to identify patients with cancer who are at risk of developing therapy-related myeloid neoplasms,” they wrote (Lancet Oncol 2017; 18: 100–11).

In a separate study, investigators from the Moffitt Cancer Center in Tampa, Florida, found in a nested case-control study that patients with therapy-related myeloid neoplasms were more likely than controls to have clonal hematopoiesis of indeterminate potential (CHIP), and that the CHIP was often present before exposure to chemotherapy.

“We recorded a significantly higher prevalence of CHIP in individuals who developed therapy-related myeloid neoplasms (cases) than in those who did not (controls); however, around 27% of individuals with CHIP did not develop therapy-related myeloid neoplasms, suggesting that this feature alone should not be used to determine a patient’s suitability for chemotherapy,” wrote Nancy K. Gillis, PharmD, and colleagues (Lancet Oncol 2017; 18:112-21).

Risk factors examined

Dr. Takahashi and colleagues noted that previous studies have identified several treatment-related risk factors as being associated with therapy-related myeloid dysplasia or leukemia, including the use of alkylating agents, topoisomerase II inhibitors, and high-dose chemotherapy with autologous stem-cell transplantation.

“By contrast, little is known about patient-specific risk factors. Older age was shown to increase the risk of therapy-related myeloid neoplasms. Several germline polymorphisms have also been associated with this risk, but none have been validated. As such, no predictive biomarkers exist for therapy-related myeloid neoplasms,” they wrote.

They performed a retrospective case-control study comparing patients who were treated for a primary cancer at their center from 1997 through 2015 and subsequently developed a myeloid neoplasm with controls treated during the same period. Controls were age-matched patients treated with combination chemotherapy for lymphoma who did not develop a therapy-related myeloid malignancy after at least 5 years of follow-up.

In addition, the investigators further explored the association between clonal hematopoiesis and therapy-related cancers in an external cohort of patients with lymphoma treated in a randomized trial at their center from 1999 through 2001. That trial compared the CHOP regimen (cyclophosphamide, doxorubicin, vincristine and prednisone) with and without melatonin.

To detect clonal hematopoiesis in pre-treatment peripheral blood, the investigators used molecular barcode sequencing of 32 genes. They also used targeted gene sequencing on bone marrow samples from cases to investigate clonal evolution from clonal hematopoiesis to the development of therapy-related myeloid neoplasms.

As noted before, 10 of 14 cases had evidence of pre-treatment clonal hematopoiesis, compared with 17 of 54 controls. For both cases and controls, the cumulative incidence of therapy-related myeloid cancers after 5 years was significantly higher among those with baseline clonal hematopoiesis, at 30% vs. 7% for patients without it (P = .016).

Five of 74 patients in the external cohort (7%) went on to develop therapy-related myeloid neoplasms, and of this group, four (80%) had clonal hematopoiesis at baseline. In contrast, of the 69 patients who did not develop therapy-related cancers, 11 (16%) had baseline clonal hematopoiesis.

In a multivariate model using data from the external cohort, clonal hematopoiesis was significantly associated with risk for therapy-related myeloid neoplasms, with a hazard ratio of 13.7 (P = .013).

Elderly patient study

Dr. Gillis and her colleagues conducted a nested, case-control, proof-of-concept study to compare the prevalence of CHIP between patients with cancer who later developed therapy-related myeloid neoplasms (cases) and patients who did not (controls).

The cases were identified from an internal biobank of 123,357 patients, and included all patients who were diagnosed with a primary cancer, treated with chemotherapy, and subsequently developed a therapy-related myeloid neoplasm. The patients had to be 70 or older at the time of either the primary or the therapy-related cancer diagnosis, with peripheral blood or mononuclear cell samples collected before the diagnosis of the second cancer.

Controls were patients diagnosed with a primary malignancy at age 70 or older who had chemotherapy but did not develop therapy-related myeloid neoplasms. Every case was matched with at least four controls selected for sex, primary tumor type, age at diagnosis, smoking status, chemotherapy drug class, and duration of follow-up.

They used sequential targeted and whole-exome sequencing to assess clonal evolution in cases for whom paired CHIP and therapy-related myeloid neoplasm samples were available.

They identified a total of 13 cases and 56 controls. Among all patients, CHIP was seen in 23 (33%). In contrast, previous studies have shown a prevalence of CHIP among older patients without cancer of about 10%, the authors note in their article.

The prevalence of CHIP was significantly higher among cases than among controls, occurring in 8 of 13 cases (62%) vs 15 of 56 controls (27%; P = .024). The odds ratio for therapy-related neoplasms with CHIP was 5.75 (P = .013).
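
As a crude, unmatched illustration, the 2-by-2 table implied by those counts can be tested directly in Python; note that the odds ratio of 5.75 reported by the investigators reflects their matched analysis, so the unadjusted estimate from this sketch will differ.

from scipy.stats import fisher_exact

# 2x2 table built from the reported counts: rows = cases/controls, columns = CHIP present/absent.
table = [[8, 13 - 8],    # cases: 8 of 13 with CHIP
         [15, 56 - 15]]  # controls: 15 of 56 with CHIP

crude_odds_ratio, p_value = fisher_exact(table)
print(f"CHIP prevalence: cases {8 / 13:.0%}, controls {15 / 56:.0%}")
print(f"crude (unmatched) odds ratio = {crude_odds_ratio:.2f}; Fisher exact P = {p_value:.3f}")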

The most commonly mutated genes were TET2 and TP53 among cases, and TET2 among controls.

“The distribution of CHIP-related gene mutations differs between individuals with therapy-related myeloid neoplasm and those without, suggesting that mutation-specific differences might exist in therapy-related myeloid neoplasm risk,” the investigators write.

Dr. Takahashi’s study was supported by the Cancer Prevention Research Institute of Texas, the Red and Charline McCombs Institute for the Early Detection and Treatment of Cancer, the National Institutes of Health through the MD Anderson Cancer Center Support Grant, and the MD Anderson MDS & AML Moon Shots Program. Dr. Gillis’ study was internally funded. Dr. Takahashi and colleagues reported no competing financial interests. Two of Dr. Gillis’ colleagues reported grants or fees from several drug companies.

Goal: Eradicate clonal hematopoiesis

The real importance of the work reported by Gillis and colleagues and Takahashi and colleagues will come when therapies exist that can effectively eradicate nascent clonal hematopoiesis, thereby preventing therapy-related myeloid neoplasm evolution in at-risk patients.

Although high-intensity treatments, such as anthracycline-based induction chemotherapy, can eradicate myeloid clones, their effectiveness in clearing TP53-mutant cells is limited, and it is difficult to imagine intense approaches having a favorable risk–benefit balance in patients whose clonal hematopoiesis might never become a problem. Existing lower-intensity therapies for myeloid neoplasms such as DNA hypomethylating agents are not curative and often do not result in the reduction of VAF [variant allele frequencies] even when hematopoietic improvement occurs during therapy, so such agents would not be expected to eliminate pre-therapy-related myeloid neoplasm clones (although this hypothesis might still be worth testing, given that the emergence of therapy-related myeloid neoplasm could at least be delayed – even if not entirely prevented – with azacitidine or decitabine).

Dr. David P. Steensma
Similarly, in de novo myelodysplastic syndrome associated with deletion of the long arm of chromosome 5 (del5q), patients who are treated with lenalidomide and achieve complete hematological and cytogenetic remission can still be shown by sensitive techniques (eg, sorting for quiescent cells and fluorescence in situ hybridization assays to show persistent del5q in these quiescent cells) to have a small population of residual hematopoietic progenitors bearing the 5q deletion.

More promising are strategies that change the bone marrow microenvironment or break the immune tolerance of abnormal clones, although the use of these approaches for myeloid neoplasia is still in the very early stages. Although no method yet exists to reliably eliminate the preleukemic clones that can give rise to therapy-related myeloid neoplasms, identification of higher risk patients could still affect monitoring practices, such as the frequency of clinical assessments. Molecular genetic panels are expensive at present but are becoming less so. Because VAF assessment by next-generation sequencing is quantitative and proportional to clone size, serial assessment could identify patients whose mutant clones are large and expanding and who therefore warrant closer monitoring or enrollment in so-called preventive hematology trials.

David P. Steensma, MD, is with the Dana-Farber Cancer Institute, Harvard Medical School, Boston. His remarks were excerpted from an accompanying editorial.

Vitals

 

Key clinical point: Pre-therapy clonal hematopoiesis is associated with increased risk for therapy-related myeloid neoplasms.

Major finding: In two studies, the incidence of therapy-related myeloid neoplasms was higher among patients with clonal hematopoiesis at baseline.

Data source: Retrospective case-control studies.

Disclosures: Dr. Takahashi’s study was supported by the Cancer Prevention Research Institute of Texas, the Red and Charline McCombs Institute for the Early Detection and Treatment of Cancer, the National Institutes of Health through the MD Anderson Cancer Center Support Grant, and the MD Anderson MDS & AML Moon Shots Program. Dr. Gillis’ study was internally funded. Dr. Takahashi and colleagues reported no competing financial interests. Two of Dr. Gillis’ colleagues reported grants or fees from several drug companies.

Rituximab after ASCT boosted survival in mantle cell lymphoma

Maintenance therapy every other month with rituximab significantly prolonged event-free and overall survival after autologous stem cell transplantation among younger patients with mantle cell lymphoma, based on results from a multicenter, randomized, phase 3 trial.

After a median follow-up period of 50 months, 79% of the rituximab maintenance arm remained alive and free of progression, relapse, and severe infection, compared with 61% of the no-maintenance arm (P = .001), said Steven Le Gouill, MD, PhD, at the 2016 meeting of the American Society of Hematology.

Courtesy Wikimedia Commons/Nephron/Creative Commons
Intermediate magnification micrograph of mantle cell lymphoma of the terminal ileum.
Rituximab maintenance reduced the risk of an event by about 54% (hazard ratio, 0.46; 95% confidence interval, 0.28 to 0.74; P = .0016), and secondary analyses linked rituximab maintenance to superior 4-year rates of progression-free survival (82% vs. 65%; P = .0005) and overall survival (89% vs. 81%; P = .041).

This is the first study linking rituximab maintenance after ASCT to improved survival in younger patients with mantle cell lymphoma, said Dr. Le Gouill of Nantes University Hospital in Nantes, France.

The trial “demonstrates for the first time that rituximab maintenance after ASCT prolongs event-free survival, progression-free survival, and overall survival” in younger patients with treatment-naïve mantle cell lymphoma, Dr. Le Gouill said. The findings confirm rituximab maintenance as “a new standard of care” in younger patients with mantle cell lymphoma, he concluded.

Prior research (http://www.nejm.org/doi/full/10.1056/NEJMoa1200920#t=abstract) supports maintenance therapy with rituximab rather than interferon alfa for older patients whose mantle cell lymphoma responded to induction with rituximab plus cyclophosphamide, doxorubicin, vincristine, and prednisone (R-CHOP), noted Dr. Le Gouill. To examine outcomes in younger patients with treatment-naïve mantle cell lymphoma, he and his associates treated 299 individuals aged 65 years and younger (median age, 57 years) with standard induction consisting of 4 courses of rituximab, dexamethasone, high-dose cytarabine, and a platinum salt (R-DHAP) every 21 days, followed by conditioning with rituximab plus BiCNU, etoposide, cytarabine, and melphalan (R-BEAM) and ASCT. Patients without at least a partial response to R-DHAP received 4 additional courses of R-CHOP-14 before ASCT. Patients were then randomized either to no maintenance or to rituximab infusions of 375 mg/m² every 2 months for 3 years.

A total of 53% of patients were low risk according to the Mantle Cell Lymphoma International Prognostic Index (MIPI), 27% were intermediate risk, and 19% were high risk. Rituximab maintenance was associated with a 60% lower risk of progression (HR, 0.4; 95% CI, 0.23 to 0.68; P = .0007) and a 50% lower risk of death (HR, 0.5; 95% CI, 0.25 to 0.98; P = .04).
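
The percent reductions quoted in this report correspond directly to the reported hazard ratios, as the following illustrative Python snippet shows.

# Convert each reported hazard ratio into the approximate percent risk reduction cited in the text.
hazard_ratios = {
    "event-free survival": 0.46,
    "progression": 0.40,
    "death": 0.50,
}

for endpoint, hr in hazard_ratios.items():
    print(f"{endpoint:<20} HR {hr:.2f} -> about {(1 - hr) * 100:.0f}% lower risk")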

The French Innovative Leukemia Organisation sponsored the trial. Dr. Le Gouill disclosed ties to Roche, Janssen-Cilag, and Celgene.

Meeting/Event
Publications
Topics
Sections
Meeting/Event
Meeting/Event

 

– Maintenance therapy every other month with rituximab significantly prolonged event-free and overall survival after autologous stem cell transplantation among younger patients with mantle cell lymphoma, based on results from a multicenter, randomized, phase 3 trial.

After a median follow-up period of 50 months, 79% of the rituximab maintenance arm remained alive and free of progression, relapse, and severe infection, compared with 61% of the no-maintenance arm (P = .001), said Steven Le Gouill, MD, PhD, at the 2016 meeting of the American Society of Hematology.

Courtesy Wikimedia Commons/Nephron/Creative Commons
Intermediate magnification micrograph of mantle cell lymphoma of the terminal ileum.
Rituximab maintenance improved the chances of event-free survival by about 54% (hazard ratio, 0.46; 95% confidence interval, 0.28 to 0.74; P = .0016), and secondary analyses linked rituximab maintenance to superior 4-year rates of progression-free survival (82% vs. 65%; P = .0005) and overall survival (89% vs. 81%; P = .041).

This is the first study linking rituximab maintenance after ASCT to improved survival in younger patients with mantle cell lymphoma, said Dr. Le Gouill of Nantes University Hospital in Nantes, France.

The trial “demonstrates for the first time that rituximab maintenance after ASCT prolongs event-free survival, progression-free survival, and overall survival” in younger patients with treatment-naïve mantle cell lymphoma, Dr. Le Gouill said. The findings confirm rituximab maintenance as “a new standard of care” in younger patients with mantle cell lymphoma, he concluded.

Prior research http://www.nejm.org/doi/full/10.1056/NEJMoa1200920#t=abstract supports maintenance therapy with rituximab rather than interferon alfa for older patients whose mantle cell lymphoma responded to induction with rituximab plus cyclophosphamide, doxorubicin, vincristine, and prednisone (R-CHOP), noted Dr. Le Gouill. To examine outcomes in younger patients with treatment-naïve mantle cell lymphoma, he and his associates treated 299 individuals aged 65 years and younger (median age, 57 years) with standard induction consisting of 4 courses of rituximab, dexamethasone, high-dose cytarabine, and salt platinum (R-DHAP) every 21 days, followed by conditioning with rituximab plus BiCNU, etoposide, cytarabine, and melphalan (R-BEAM) and ASCT. Patients without at least a partial response to R-DHAP received 4 additional courses of R-CHOP-14 before ASCT. Patients then were randomized either to no maintenance or to infusions of 375 mg R per m2 every 2 months for 3 years.

A total of 53% of patients were mantle cell lymphoma international prognostic index (MIPI) low risk, 27% were intermediate and 19% were high-risk. Rituximab maintenance was associated with a 60% lower risk of progression (HR, 0.4; 95% CI, 0.23 to 0.68; P = .0007) and a 50% lower risk of death (HR, 0.5; 95% CI, 0.25 to 0.98; P = .04).

The French Innovative Leukemia Organisation sponsored the trial. Dr. Le Gouill disclosed ties to Roche, Janssen-Cilag, and Celgene.

 

– Maintenance therapy every other month with rituximab significantly prolonged event-free and overall survival after autologous stem cell transplantation among younger patients with mantle cell lymphoma, based on results from a multicenter, randomized, phase 3 trial.

After a median follow-up period of 50 months, 79% of the rituximab maintenance arm remained alive and free of progression, relapse, and severe infection, compared with 61% of the no-maintenance arm (P = .001), said Steven Le Gouill, MD, PhD, at the 2016 meeting of the American Society of Hematology.

Courtesy Wikimedia Commons/Nephron/Creative Commons
Intermediate magnification micrograph of mantle cell lymphoma of the terminal ileum.
Rituximab maintenance improved the chances of event-free survival by about 54% (hazard ratio, 0.46; 95% confidence interval, 0.28 to 0.74; P = .0016), and secondary analyses linked rituximab maintenance to superior 4-year rates of progression-free survival (82% vs. 65%; P = .0005) and overall survival (89% vs. 81%; P = .041).

This is the first study linking rituximab maintenance after ASCT to improved survival in younger patients with mantle cell lymphoma, said Dr. Le Gouill of Nantes University Hospital in Nantes, France.

The trial “demonstrates for the first time that rituximab maintenance after ASCT prolongs event-free survival, progression-free survival, and overall survival” in younger patients with treatment-naïve mantle cell lymphoma, Dr. Le Gouill said. The findings confirm rituximab maintenance as “a new standard of care” in younger patients with mantle cell lymphoma, he concluded.

Prior research http://www.nejm.org/doi/full/10.1056/NEJMoa1200920#t=abstract supports maintenance therapy with rituximab rather than interferon alfa for older patients whose mantle cell lymphoma responded to induction with rituximab plus cyclophosphamide, doxorubicin, vincristine, and prednisone (R-CHOP), noted Dr. Le Gouill. To examine outcomes in younger patients with treatment-naïve mantle cell lymphoma, he and his associates treated 299 individuals aged 65 years and younger (median age, 57 years) with standard induction consisting of 4 courses of rituximab, dexamethasone, high-dose cytarabine, and salt platinum (R-DHAP) every 21 days, followed by conditioning with rituximab plus BiCNU, etoposide, cytarabine, and melphalan (R-BEAM) and ASCT. Patients without at least a partial response to R-DHAP received 4 additional courses of R-CHOP-14 before ASCT. Patients then were randomized either to no maintenance or to infusions of 375 mg R per m2 every 2 months for 3 years.

A total of 53% of patients were low risk on the Mantle Cell Lymphoma International Prognostic Index (MIPI), 27% were intermediate risk, and 19% were high risk. Rituximab maintenance was associated with a 60% lower risk of progression (HR, 0.40; 95% CI, 0.23 to 0.68; P = .0007) and a 50% lower risk of death (HR, 0.50; 95% CI, 0.25 to 0.98; P = .04).

The French Innovative Leukemia Organisation sponsored the trial. Dr. Le Gouill disclosed ties to Roche, Janssen-Cilag, and Celgene.


AT ASH 2016

Vitals

 

Key clinical point: Maintenance therapy with rituximab after autologous stem cell transplantation was associated with significantly increased survival among younger patients with mantle cell lymphoma.

Major finding: After a median follow-up time of 50 months, 79% of patients who received rituximab maintenance remained alive and free of progression, relapse, and severe infection, compared with 61% of those who received no maintenance therapy (P = .001).

Data source: A multicenter randomized phase 3 trial of 299 adults up to 65 years old with mantle cell lymphoma.

Disclosures: The French Innovative Leukemia Organisation sponsored the trial. Dr. Le Gouill disclosed ties to Roche, Janssen-Cilag, and Celgene.

President Trump hits ground running on ACA repeal

Article Type
Changed
Wed, 04/03/2019 - 10:29

 

WASHINGTON – President Trump wasted no time in getting the executive branch’s wheels in motion toward repeal of the Affordable Care Act.

Within hours of being sworn in as the 45th president of the United States on Jan. 20, he signed an executive order that announced the incoming administration’s policy “to seek the prompt repeal of the Patient Protection and Affordable Care Act.”

The order opens the door for federal agencies to tackle ACA provisions such as the individual mandate and its tax penalties for not carrying insurance, as well as other financial aspects of the ACA that impact patients, providers, insurers, and manufacturers.

[Image: Donald J. Trump. Gage Skidmore/Wikimedia Commons/CC BY-SA 2.0 (http://creativecommons.org/licenses/by-sa/2.0), via Wikimedia Commons]
President Trump directed the Health & Human Services department and other departments with ACA oversight to “exercise all authority and discretion available to them to waive, defer, grant exemptions from, or delay the implementation of any provision or requirement of the Act that would impose a fiscal burden on any State or a cost, fee, tax, penalty, or regulatory burden on individuals, families, health care providers, health insurers, patients, recipients of health care services, purchasers of health insurance, or makers of medical devices, products, or medications.”

The order directs the secretaries of HHS, the Treasury department, and the Labor department to “exercise all authority and discretion available to them to provide greater flexibility to States and cooperate with them in implementing healthcare programs.”

With this order, President Trump also set the stage for creating a framework to sell insurance products across state lines by directing secretaries with oversight of insurance markets to “encourage the development of a free and open market in interstate commerce for the offering of healthcare services and health insurance, with the goal of achieving and preserving maximum options for patients and consumers.”

Little action is expected on the executive order until secretaries are approved for HHS, Treasury, and Labor. Rep. Tom Price (R-Ga.), the nominee for HHS secretary, is scheduled to appear before the Senate Finance Committee on Jan. 24.
 
