Many new cancer drugs lack evidence of survival or QoL benefit
Even after several years on the market, only about half of cancer drug indications recently approved by the European Medicines Agency (EMA) had conclusive evidence that they extend survival or improve quality of life, according to results of a retrospective cohort study.
With a median of 5.4 years of follow-up, significant improvements in overall survival or quality of life had been published for 35 of 68 (51%) cancer drug indications approved by the EMA, according to the report by Courtney Davis, MD, senior lecturer in the department of global health and social medicine, King’s College London, United Kingdom, and colleagues.
Furthermore, not all survival benefits were clinically meaningful, according to an analysis published in the report.
The dearth of evidence for survival or quality-of-life benefits has “negative implications” for both patients and public health, Dr. Davis and colleagues said in their article (BMJ 2017 Oct 5. doi:10.1136/bmj.j4530).
“When expensive drugs that lack clinically meaningful benefits are approved and paid for within publicly funded healthcare systems, individual patients can be harmed, important societal resources wasted, and the delivery of equitable and affordable care undermined,” they wrote.
Dr. Davis and associates systematically evaluated the evidence base for regulatory and scientific reports on 48 cancer drugs approved for 68 indications by the EMA between 2009 and 2013. Of those indications, 17 were for hematologic malignancies and 51 were for solid tumors.
Only 18 of 68 indications (26%) were supported by pivotal studies that had a primary outcome of overall survival, according to the investigators. That was an important finding for the investigators, who wrote that the EMA commonly accepts surrogate measures of drug benefit despite the agency's own statements that overall survival is the “most persuasive outcome” in studies of new oncology drugs.
“To a large extent, regulatory evidence standards determine the clinical value of … new oncology drugs,” Dr. Davis and co-authors wrote. “Our study suggests these standards are failing to incentivize drug development that best meets the needs of patients, clinicians, and healthcare systems.”
The investigators also assessed the clinical value of reported improvements using the European Society for Medical Oncology-Magnitude of Clinical Benefit Scale (ESMO-MCBS). According to investigators, only 11 of the 23 drugs used to treat solid tumors (48%) reached the threshold for a meaningful survival benefit.
This report in BMJ echoes findings of an earlier study by Chul Kim, MD, and colleagues looking at cancer drugs approved by the U.S. Food and Drug Administration (FDA) between 2008 and 2012 (JAMA Intern Med. 2015;175(12):1992-4).
Dr. Kim, of the medical oncology service, National Cancer Institute, National Institutes of Health, Bethesda, Md., and colleagues found that 36 of 54 FDA approvals (67%) occurred with no evidence of survival or quality of life benefit. After a median of 4.4 years of follow-up, only 5 of those 36 (14%) had additional randomized study data that showed an improvement in overall survival, according to the published report.
The study was supported by Health Action International, which did not have a role in study design or data collection, analysis, or interpretation. The authors did not give financial disclosures.
The expense and toxicity of cancer drugs mean we have an obligation to expose patients to treatment only when they can reasonably expect an improvement in survival or quality of life. The study by Davis and colleagues suggests we may be falling far short of this important benchmark.
Few cancer drugs come to market with good evidence that they improve patient centered outcomes. If they do, they often offer marginal benefits that may be lost in the heterogeneous patients of the real world. Most approvals of cancer drugs are based on flimsy or untested surrogate endpoints, and postmarketing studies rarely validate the efficacy and safety of these drugs on patient centered endpoints.
In the United States, this broken system means huge expenditures on cancer drugs with certain toxicity but uncertain benefit. In Europe, payers wield the stick left unused by lax regulators. The National Institute for Health and Care Excellence (NICE) excludes from reimbursement drugs that provide only marginal or uncertain benefits at high cost, and its decisions are continually subjected to political scrutiny and public criticism.
What can be done? The default path to market for all cancer drugs should include rigorous testing against the best standard of care in randomized trials powered to rule in or rule out a clinically meaningful difference in patient centered outcomes in a representative population. The use of uncontrolled study designs or surrogate endpoints should be the exception, not the rule.
Vinay Prasad, MD, MPH, is assistant professor of medicine at Oregon Health and Science University, Portland. He declared a competing interest (royalties from his book Ending Medical Reversal). These comments are from his editorial (BMJ 2017 Oct 5. doi:10.1136/bmj.j4528).
FROM BMJ
Key clinical point: Even after several years on the market, only about half of cancer drug indications recently approved by the European Medicines Agency (EMA) had conclusive evidence that they extend survival or improve quality of life.
Major finding: With a median of 5.4 years of follow-up, significant improvements in overall survival or quality of life had been published for 35 of 68 (51%) cancer drug indications approved by the EMA.
Data source: Retrospective cohort study of regulatory and scientific reports on 48 cancer drugs approved for 68 indications by the EMA between 2009 and 2013.
Disclosures: The study was supported by Health Action International, which did not have a role in study design or data collection, analysis, or interpretation.
Semaglutide aids T2DM weight loss over 2 years
LISBON – The investigational glucagon-like peptide-1 (GLP-1) receptor agonist semaglutide added to standard care for type 2 diabetes mellitus (T2DM) resulted in clinically significant weight loss over 2 years in the SUSTAIN-6 phase 3 trial.
Participants treated with semaglutide in the study lost an average of 3.6-4.9 kg, depending on the dose they were given (0.5 mg or 1.0 mg), which was significantly (P less than .0001) more than the average losses of 0.7 kg and 0.5 kg in those randomized to matching placebos.
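As a rough back-of-the-envelope check (illustrative arithmetic only, assuming the group means above pair with the placebo-subtracted treatment differences quoted in the trial summary below), the placebo-adjusted weight loss works out to approximately:

\[
3.6\ \text{kg} - 0.7\ \text{kg} \approx 2.9\ \text{kg} \;\;(0.5\ \text{mg dose}), \qquad 4.9\ \text{kg} - 0.5\ \text{kg} \approx 4.4\ \text{kg} \;\;(1.0\ \text{mg dose})
\]

These crude differences are close to the -2.87 kg and -4.35 kg estimated treatment effects reported in the trial summary; the small gaps presumably reflect model-based estimation rather than simple subtraction of group means.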
Semaglutide is under development by Novo Nordisk and is currently under review by regulatory agencies in the United States, Europe, and Japan. It has 94% homology to human GLP-1, with modifications that protect it from degradation and give it a half-life long enough to allow once-weekly dosing.
SUSTAIN 6 is part of an ongoing phase 3 program and is a long-term outcome study with the primary objective of evaluating the cardiovascular safety of semaglutide. Effects on macro- and microvascular complications, glycemic control, body weight, body mass index, and waist circumference are key secondary endpoints, together with assessment of its overall safety and tolerability.
Other trials in the program have evaluated treatment with semaglutide as monotherapy (SUSTAIN 1; Lancet Diabetes Endocrinol. 2017;5:251-60) or versus other treatments including sitagliptin (Januvia, Merck; SUSTAIN 2; Lancet Diabetes Endocrinol. 2017;5:341-54), exenatide extended release (Bydureon, AstraZeneca; SUSTAIN 3), or insulin glargine (SUSTAIN 4), as add-on to basal insulin with or without metformin (SUSTAIN 5), and most recently, versus dulaglutide (Trulicity, Eli Lilly; SUSTAIN 7).
SUSTAIN 6 involved 3,297 people with T2DM with established cardiovascular disease or chronic kidney disease or otherwise identified as being at increased cardiovascular risk, according to Agostino Consoli, MD, an endocrinologist and professor at the University of Chieti-Pescara, Italy. The results of the primary endpoint have been reported (N Engl J Med. 2016;375:1834-44) and showed that the composite rate of cardiovascular death, nonfatal myocardial infarction, or nonfatal stroke was significantly lower among patients receiving semaglutide than among those receiving placebo. The hazard ratio for the reduction in the composite endpoint was 0.74 (95% CI, 0.58-0.95; P less than .001 for noninferiority).
Results of the secondary analyses reported by Dr. Consoli at EASD 2017 showed that semaglutide could help more patients than placebo achieve significant weight loss, which could help further reduce their cardiovascular risk. He reported that a 5% or greater weight loss at 2 years was achieved by 36% and 47% of patients in the semaglutide 0.5 mg and 1 mg groups, respectively, and by 18% and 19% of patients in the matching placebo groups (P less than .0001 for both comparisons). A 10% or greater weight loss was achieved by 13% and 21% of the semaglutide-treated patients and by 6% and 7% of those given placebo.
“The effect of weight was not dependent on BMI [body mass index] at baseline,” Dr. Consoli said, emphasizing that there was a consistent reduction in weight across all BMI categories. Importantly, Dr. Consoli observed, the effects of semaglutide on weight were not driven by just a few patients losing weight; around 80% of patients in the study experienced some degree of weight loss.
“As expected, the subjects treated with the GLP-1a had more GI [gastrointestinal] effects,” Dr. Consoli reported. Nausea or vomiting were reported in twice as many patients treated with semaglutide 0.5 mg (21.9%) and 1 mg (27.3%) as their placebo-matched counterparts (10.8% and 10.6%).
A post-hoc analysis found that the effect of semaglutide on weight loss was unlikely to be attributable to these side effects, however, with similar weight reductions seen in those who did and did not experience nausea or vomiting. The “estimated natural direct effect of treatment” was -2.75 kg for the 0.5 mg dose and -4.32 kg for the 1 mg dose of semaglutide versus their placebos, Dr. Consoli said. Gastrointestinal side effects drove the weight loss only to a small degree: just -0.12 kg and -0.04 kg of the weight loss seen in the 0.5 mg and 1 mg semaglutide groups versus their placebos could be ascribed to nausea or vomiting.
In a poster presentation at the meeting, data on another post-hoc analysis from the SUSTAIN phase 3 program were reported. In a responder analysis of T2DM patients achieving glycemic and weight loss thresholds, a greater proportion of those treated with semaglutide achieved clinically meaningful reductions in both glycated hemoglobin (HbA1c) and body weight than those given comparator treatments.
The composite endpoint of at least a 1% reduction in HbA1c and a 5% or greater decrease in body weight was achieved by 25%–35% of patients treated with the 0.5 mg dose of semaglutide, by 38%–56% of those given the higher dose, and by 2%–13% of those given comparators (P less than .0001). The higher dose of semaglutide also allowed more people to achieve this endpoint than the lower dose.
Novo Nordisk supported the study. Dr. Consoli disclosed receiving research funding from AstraZeneca and Novo Nordisk and speaker’s bureau or consultation fees from AstraZeneca, Boehringer Ingelheim, Eli Lilly & Co., Merck, Sharp & Dohme, Novartis, Sanofi-Aventis, and Takeda.
AT EASD 2017
Key clinical point: Semaglutide added to standard of care was associated with significant weight loss independent of any gastrointestinal side effects.
Major finding: There was a -2.87 kg to -4.35 kg change in body weight comparing two doses of semaglutide with matching placebos (both P less than .0001).
Data source: SUSTAIN-6: A long-term outcomes study in 3,297 patients with type 2 diabetes treated with once-weekly semaglutide or placebo for 104 weeks.
Disclosures: Novo Nordisk supported the study. Dr. Consoli disclosed receiving research funding from AstraZeneca and Novo Nordisk and speaker’s bureau or consultation fees from AstraZeneca, Boehringer Ingelheim, Eli Lilly, Merck, Sharp & Dohme, Novartis, Sanofi-Aventis, and Takeda.
HOPE-3 wades into fray regarding optimal blood pressure targets
BARCELONA – How low to go in treating hypertension is a topic of considerable recent controversy. Now the HOPE-3 trial investigators have weighed in: optimal outcomes in their landmark randomized trial were seen with an achieved, on-treatment systolic blood pressure (SBP) of 130-140 mm Hg and a diastolic blood pressure (DBP) of 75-80 mm Hg, Eva M. Lonn, MD, reported at the annual congress of the European Society of Cardiology.
Those results stand in glaring contrast to the findings of the much-discussed SPRINT trial, in which hypertensive patients fared best with an on-treatment SBP driven below 120 mm Hg (N Engl J Med. 2015 Nov 26; 373:2103-16).
“Please note that lower blood pressures, both systolic and diastolic, weren’t associated with lower risk, whereas higher blood pressures considerably increased the risk for major vascular events,” she added.
HOPE-3 (the Third Heart Outcomes Prevention Evaluation) included 12,705 patients in 21 countries who did not have cardiovascular disease and were at intermediate risk, with an average age of 65 years at enrollment and a Framingham Risk Score of about 10%. They were randomized double-blind in a 2×2 factorial design to rosuvastatin at 10 mg per day or placebo and/or candesartan at 16 mg plus hydrochlorothiazide at 12.5 mg per day or placebo and prospectively followed for a median of 5.6 years.
The primary outcomes of HOPE-3 have been published (N Engl J Med. 2016 May 26;374[21]:2009-20 and 2021-31). This was a practice-changing trial that opened the door to broader use of statin therapy for primary prevention.
At the ESC congress in Barcelona, Dr. Lonn presented a secondary post-hoc analysis that focused on the impact of antihypertensive therapy in HOPE-3. The results shed new light on the optimal blood pressure levels for triggering initiation of antihypertensive therapy, as well as defining the achieved blood pressures that resulted in the greatest reductions in major vascular events.
As this was essentially an all-comers trial of intermediate-risk patients, participants presented with a range of blood pressures at baseline. But more than 4,700 subjects had a baseline SBP of 140-159.9 mm Hg, and 833 had an SBP of 160 mm Hg or more.
The candesartan/hydrochlorothiazide regimen resulted in what Dr. Lonn termed a “moderate” net placebo-subtracted blood pressure reduction of 6/3 mm Hg. The higher the baseline blood pressure, the bigger the reduction.
In the one-third of subjects with a baseline SBP greater than 143.5 mm Hg, antihypertensive therapy resulted in a significant 27% reduction in the composite endpoint of cardiovascular death, MI, or stroke compared with placebo. Those with a baseline SBP of 150 mm Hg or more showed even greater benefit from antihypertensive therapy, with a composite event rate of 4.8% compared with 7.2% for placebo, representing a 34% relative risk reduction in which the event curves began separating at about 2 years.
In contrast, antihypertensive therapy brought no significant reduction in events in patients in the lower two tertiles of baseline SBP. And there was no association at all between baseline DBP and major cardiovascular events across the range of DBP values evaluated in HOPE-3.
But wait: Things get more interesting, according to the cardiac electrophysiologist.
“I find the association between mean in-trial blood pressure as recorded in many measurements and vascular outcomes to be the most interesting analysis. This may be a better look at the association between blood pressure and outcomes than a measurement obtained just once or twice at baseline,” she explained.
Of note, among the 6,356 subjects on candesartan/hydrochlorothiazide, those with a mean on-treatment SBP of 160 mm Hg or more had a 2.61% per year rate of the composite of cardiovascular death, MI, stroke, rescue from cardiac arrest, heart failure, or revascularization. This was more than three-fold higher than the 0.75% per year rate in patients with an on-treatment SBP of 120-139.9 mm Hg. The composite event rate was also significantly higher in those with a mean on-treatment SBP of 140-159.9 mm Hg, at 1.4% per year. The event rate in patients with an on-treatment SBP below 120 mm Hg was identical to that of patients with a value of 120-139.9 mm Hg.
Only among patients with an on-treatment DBP of 90 mm Hg or more was the composite event rate significantly greater than in those with a DBP of 70-79.9 mm Hg, who had the lowest event rate (1.89% vs. 0.75% per year).
An Australian cardiologist in the audience who has been involved in revamping hypertension treatment guidelines Down Under expressed frustration. He only recently succeeded in wrangling his fellow panelists into incorporating the SPRINT results into the draft guidelines; now HOPE-3 is sending a very different message. What gives? Could the disparate findings simply be due to play of chance? he asked.
Highly unlikely, Dr. Lonn replied.
“There were substantial differences between our trials,” she explained. “First of all, the SPRINT population was at substantially higher risk. They either had to have established cardiovascular disease – we eliminated those people – or significant renal disease – we eliminated those people, too – or age greater than 75, or a Framingham Risk Score above 15%.”
Also, the SPRINT protocol controversially called for unattended blood pressure measurement.
“This is a very pure way of eliminating white coat hypertension, but it is different from other studies, so it is very difficult to compare SPRINT to older studies or to HOPE-3. Some other investigators have suggested that the difference between attended and unattended blood pressure is close to 10 mm Hg. So our SBP of 130 mm Hg, which had the best outcomes in HOPE-3, may be the same as about 120 mm Hg in SPRINT,” according to Dr. Lonn.
HOPE-3 was funded by the Canadian Institutes of Health Research and AstraZeneca. Dr. Lonn reported serving as a consultant to and receiving research grants from AstraZeneca, Amgen, Bayer, and Novartis.
AT THE ESC CONGRESS 2017
Key clinical point: Optimal outcomes in intermediate-risk patients were seen with an achieved, on-treatment systolic blood pressure of 130-140 mm Hg and a diastolic blood pressure of 75-80 mm Hg.
Major finding: The on-treatment systolic blood pressure target associated with the greatest reduction in vascular events in the HOPE-3 trial was 130-140 mm Hg.
Data source: The HOPE-3 trial was a randomized, double-blind, placebo-controlled study of 12,705 intermediate-cardiovascular-risk patients in 21 countries who were prospectively followed for a median 5.6 years.
Disclosures: HOPE-3 was funded by the Canadian Institutes of Health Research and AstraZeneca. The presenter reported serving as a consultant to and receiving research grants from AstraZeneca, Amgen, Bayer, and Novartis.
Rosacea patients host the most mites
Infestation with Demodex mites was significantly more common in patients with rosacea compared with healthy controls, based on data from a meta-analysis of 1,513 adults with rosacea. The findings were published in the September issue of the Journal of the American Academy of Dermatology.
The cause of rosacea remains unclear and differs within subgroups, but previous studies have suggested an association between rosacea and the presence of Demodex mites, wrote Yin-Shuo Chang, MD, and Yu-Chen Huang, MD, both of Taipei Medical University, Taiwan (J Am Acad Dermatol. 2017; 77[3]:441-7).
The researchers had no financial conflicts to disclose.
Find the full study online here: http://www.jaad.org/article/S0190-9622(17)30429-2/fulltext.
FROM JAAD
Key clinical point: Demodex mite infestations are significantly associated with rosacea.
Major finding: Rosacea patients were 9 times more likely to experience Demodex mite infestations compared with healthy controls.
Data source: The data come from a meta-analysis of 1,513 adults with rosacea.
Disclosures: The researchers had no financial conflicts to disclose.
Household MRSA contamination predicted human recolonization
SAN DIEGO – Patients who were successfully treated for methicillin-resistant Staphylococcus aureus infections were about four times more likely to become recolonized if their homes contained MRSA, according to the results of a longitudinal household study.
“Many of these homes were contaminated with a classic community strain,” Meghan Frost Davis, DVM, PhD, MPH, said during an oral presentation at an annual meeting on infectious disease. “We need to think about interventions in the home environment to improve our ability to achieve successful decolonization.”
Importantly, Dr. Davis and her associates recently published another study in which biocidal disinfectants failed to eliminate MRSA from homes and appeared to increase the risk of multi-drug resistance (Appl Environ Microbiol. online 22 September 2017, doi: 10.1128/AEM.01369-17). Her team is testing MRSA isolates for resistance to disinfectants and hopes to have more information in about a year, she said. Until then, Dr. Davis suggests advising patients with MRSA to clean sheets and pillowcases frequently.
S. aureus can survive in the environment for long periods. In one case, an MRSA strain tied to an outbreak was cultured from a dry mop that had been locked in a closet for 79 months, Dr. Davis said. “This is concerning because the home is a place that receives bacteria from us,” she said. “A person who was originally colonized or infected with MRSA may clear naturally or through treatment, but the environment may become a reservoir for recolonization and infection.”
To better understand the role of this reservoir, she and her associates recruited 88 index outpatients with MRSA skin and soft tissue infections who were part of a randomized trial of MRSA decolonization strategies. At baseline and 3 months later, the researchers sampled multiple sites in each patient’s home and all household pets. Patients and household members also swabbed themselves in multiple body sites every 2 weeks for up to 3 months. Swabs were cultured in enrichment broth, and positive results were confirmed by PCR (Infect Control Hosp Epidemiol. 2016 Oct;37[10]:1226-33. doi: 10.1017/ice.2016.138. Epub 2016 Jul 28).
Even after accounting for potential confounders, household contamination with MRSA was associated with about a three- to five-fold increase in the odds of human colonization, which was statistically significant. Seventy percent of households had at least one pet, but only 10% had a pet colonized with MRSA. Having such a pet increased the risk of human carriage slightly, but not significantly. However, having more than one pet did predict human colonization, Dr. Davis said. Even if pets aren’t colonized, they still can carry MRSA on “the petting zone” – the top of the head and back, she explained. Thus, pets can serve as reservoirs for MRSA without being colonized.
In all, 53 index patients had at least two consecutive negative cultures and thus were considered decolonized. However, 43% of these individuals were subsequently recolonized, and those whose homes contained MRSA at baseline were about 4.3 times more likely to become recolonized than those whose households cultured negative (hazard ratio, 4.3; 95% CI, 1.2-16; P less than .03).
A total of six patients were persistently colonized with MRSA, and 62% of contaminated homes tested positive for MRSA staphylococcal protein A (spa) type t008, a common community-onset strain. Living in one of these households significantly increased the chances of persistent colonization (odds ratio, 12.7; 95% CI, 1.33-122; P less than .03).
Pets testing positive for MRSA always came from homes that also tested positive, so these factors couldn’t be disentangled, Dr. Davis said. Rarely touched “repository” surfaces in homes – such as the top of a refrigerator – were just as likely to be contaminated with MRSA as high-touch surfaces. However, pillowcases were often the most contaminated of all. “If I can give you one take-home message, when you treat people with MRSA, you may want to tell them to clean their sheets and pillowcases a lot,” she said.
Dr. Davis and her associates had no disclosures.
IDWEEK 2017
Key clinical point: Patients who were successfully decolonized of MRSA were significantly more likely to become recolonized if their homes were contaminated with MRSA.
Major finding: The risk of recolonization was about four-fold higher if a home was contaminated at baseline (hazard ratio, 4.3; 95% CI, 1.2-16).
Data source: A nested household study that enrolled 88 index patients with MRSA skin and soft tissue infections.
Disclosures: Dr. Davis and her associates had no disclosures.
Drug receives orphan designation for treatment of MDS
The European Commission has granted orphan designation to asunercept (APG101) for the treatment of myelodysplastic syndromes (MDS).
Asunercept is a fully human fusion protein that consists of the extracellular domain of the CD95 receptor and the Fc domain of an IgG1 antibody.
Asunercept binds to the CD95 ligand and blocks activation of the CD95 receptor.
Excessive stimulation of the CD95 receptor on hematopoietic precursors inhibits erythropoiesis in MDS patients.
As a result, the patients develop transfusion-dependent anemia that is refractory to erythropoiesis-stimulating agents (ESAs).
Treatment with asunercept, by inhibiting the CD95 system, stimulates the production of red blood cells and decreases transfusion dependency.
Asunercept has been evaluated in a phase 1 trial, the results of which were presented at the 2016 ASH Annual Meeting.
The trial enrolled 20 patients with low- to intermediate-risk MDS. All patients had anemia resulting in a high transfusion burden, had hemoglobin levels of less than 10 g/dL, and were refractory to ESAs.
Patients received once-weekly asunercept infusions for 12 weeks. Eight of the 20 patients (40%) experienced a reduction in transfusion frequency for 6 months.
Asunercept was considered generally well tolerated, with no grade 3 or higher treatment-related adverse events reported. The most common treatment-emergent adverse events were peripheral edema (n=6), urinary tract infection (n=4), and oral herpes (n=3).
One patient developed acute myeloid leukemia, and 1 patient died from sepsis due to pre-existing neutropenia.
“We are highly encouraged by the data from our clinical phase 1 trial with asunercept in these patients and are currently preparing to initiate a clinical phase 2 proof-of-concept trial to further evaluate the efficacy of asunercept in MDS,” said Harald Fricke, chief medical officer of Apogenix AG, the company developing asunercept.
About orphan designation
Orphan designation provides regulatory and financial incentives for companies to develop and market therapies that treat life-threatening or chronically debilitating conditions affecting no more than 5 in 10,000 people in the European Union, and where no satisfactory treatment is available.
Orphan designation provides a 10-year period of marketing exclusivity if the drug receives regulatory approval.
The designation also provides incentives for companies seeking protocol assistance from the European Medicines Agency during the product development phase and direct access to the centralized authorization procedure.
Study IDs risk predictors of PrEP use in MSM
SAN DIEGO – Recent sexual risk behavior and partnership type may be important predictors of pre-exposure prophylaxis adherence in men who have sex with men, results from a 48-week study suggest.
“We know from other studies including iPrEX, the Partners PrEP, and the Demo project that individuals who report higher risk behaviors are more likely to be adherent to pre-exposure prophylaxis (PrEP),” lead study author Jill Blumenthal, MD, said in an interview in advance of an annual scientific meeting on infectious diseases.
As part of a California Collaborative Treatment Group PrEP demonstration study of 398 HIV-negative, at-risk men who have sex with men and transgender women, Dr. Blumenthal, of the Antiviral Research Center at the University of California, San Diego, and her associates estimated participants’ HIV risk scores at baseline and at week 48. The score was the estimated probability of seroconversion over the next year, based on the number of condomless anal sex acts with HIV-positive or unknown-status partners in the past month and any sexually transmitted infection diagnosed at the study visit. The researchers categorized the score as low (less than 0.12), moderate (0.12-0.59), or high (greater than 0.59) risk on the basis of population seroconversion probabilities. Partnership type in the past 3 months was classified as no partner or a single HIV-negative partner, a single HIV-positive partner, or multiple partners of any serostatus. PrEP adherence was estimated from intracellular tenofovir-diphosphate (TFV-DP) levels, treated as a continuous variable at week 48.
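For readers who want to see how the reported cutoffs work in practice, here is a minimal Python sketch, illustrative only, that maps an estimated 1-year seroconversion probability to the low/moderate/high categories described above. The function name and example probabilities are hypothetical; the study’s underlying risk model, which combines condomless sex acts and STI diagnoses, is not reproduced here.

```python
# Illustrative sketch only: applies the thresholds the investigators reported
# (low < 0.12, moderate 0.12-0.59, high > 0.59) to a hypothetical probability.
# The study's actual scoring model is not published in this summary.

def categorize_hiv_risk(seroconversion_probability: float) -> str:
    """Map an estimated 1-year seroconversion probability to a risk category."""
    if seroconversion_probability < 0.12:
        return "low"
    elif seroconversion_probability <= 0.59:
        return "moderate"
    else:
        return "high"

if __name__ == "__main__":
    # Hypothetical values, not study data.
    for p in (0.05, 0.30, 0.75):
        print(f"probability {p:.2f} -> {categorize_hiv_risk(p)} risk")
```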
Of the 313 participants who completed week 48, there was no significant change in HIV risk category from baseline (low: 44% to 42%; moderate: 27% to 24%; high: 28% to 34%; P = 0.25). There was, however, a significant change in partnership type, with the proportion reporting no partner or a single HIV-negative partner increasing from 1% to 9% (P less than 0.001). On univariate analysis, the moderate- and high-risk groups had higher TFV-DP levels than the low-risk group at week 48 (P = 0.018), and participants with no partner or a single HIV-negative partner had significantly lower TFV-DP levels than those with a single HIV-positive partner or multiple partners (P = 0.007). On multivariable linear regression, only partnership type remained significant, with no/single HIV-negative partnerships associated with lower TFV-DP levels (P = 0.014).
“Although more individuals in our study reported having either no or a single HIV-negative partners by the end of the study, there was no decrease in risk behavior based on reported condomless anal sex acts and laboratory-confirmed STIs over time,” Dr. Blumenthal said. “However, those risk behaviors did not increase either, arguing against risk compensation. Individuals with higher HIV risk behaviors and in riskier partnerships (those with either a single HIV+ or multiple partners) had higher TFV-DP levels at week 48 suggesting a maintained, strong motivation for PrEP adherence.”
She acknowledged certain limitations of the study, including the fact that the risk behavior score used in the analysis has not been validated in prospective studies of HIV incidence. “In addition, participants in the study were not allowed to start and stop PrEP, so less risky individuals may have remained in the study in the event they wanted to restart PrEP,” she said. Dr. Blumenthal disclosed that she is a Gilead Educational Grant recipient and that the study drug was provided by Gilead.
ID WEEK 2017
Key clinical point: Recent HIV risk behavior and partnership type predict pre-exposure prophylaxis adherence in men who have sex with men.
Major finding: After 48 weeks, the proportion of those with no or single HIV- partnerships increased from 1% to 9% (P less than 0.001).
Study details: A demonstration study of 398 HIV-negative at-risk men who have sex with men and transgender women.
Disclosures: Dr. Blumenthal disclosed that she is a Gilead Educational Grant recipient and that the study drug was provided by Gilead.
Focal cultures, PCR upped Kingella detection in pediatric hematogenous osteomyelitis
SAN DIEGO – Early focal cultures and strategic use of polymerase chain reaction (PCR) testing helped a hospital detect Kingella kingae seven times more often in a study of young children with acute non-complex hematogenous osteomyelitis, Rachel Quick, MSN, CNS, said at an annual meeting on infectious diseases.
Kingella kingae turned out to be the leading culprit in these cases, although the new approach also enhanced detection of Staphylococcus aureus and other bacteria, said Ms. Quick of Seton Healthcare Family in Austin, Texas. Children also transitioned to oral antibiotics a median of 22 days sooner and needed peripherally inserted central catheters (PICCs) two-thirds less often after the guideline was implemented, she said during her oral presentation.
Ms. Quick and her associates compared 25 children treated before the guideline was implemented with 24 children treated afterward. Patients were 6 months to 5 years old, had physical signs and symptoms of acute hematogenous osteomyelitis or septic joint, and had been symptomatic for less than 14 days. The study was conducted between 2009 and 2016.
Kingella kingae was identified in one patient (4%) from the baseline cohort and in seven patients (29%) after the guideline was rolled out (P = .02), Ms. Quick said. Kingella was cultured from focal samples only, not from blood. Detection of methicillin-sensitive Staphylococcus aureus (MSSA) jumped from 8% to 17%, while cases with no detectable pathogen dropped from 80% to 46%. Lengths of stay and readmission rates did not change significantly.
Taken together, the findings show how early focal cultures and PCR can facilitate targeted therapy in acute pediatric bone and joint infections, prevent unnecessary antibiotic use, and expedite a targeted transition to oral antibiotics, said Ms. Quick. “We recognize that we have a small sample and that these are not complicated cases,” she said. “Our findings do not suggest it’s more important to look for Kingella than Staphylococcus aureus, but that Kingella should be up there in the ranks of what we’re looking for.”
The investigators reported having no conflicts of interest.
IDWEEK 2017
Key clinical point: Early focal cultures and strategic use of polymerase chain reaction (PCR) testing enhanced detection of Kingella kingae and other bacteria in young children with acute non-complex hematogenous osteomyelitis.
Major finding: Detection of Kingella surged from 4% to 29%.
Data source: A retrospective cohort study of 49 children with non-complex acute hematogenous osteomyelitis or septic arthritis.
Disclosures: The investigators reported having no conflicts of interest.
BP accuracy is the ghost in the machine
SAN FRANCISCO – Amid all the talk about subgroup blood pressure targets and tiny differences in drug regimens at a recent hypertension meeting, there was an elephant in the room that attendees refused to ignore.
“We do it wrong,” said Dr. Steven Yarows, a primary care physician in Chelsea, Mich., who estimated he’s taken 44,000 blood pressures in his 36 years of practice.
Everyone in medicine is taught that people should rest a bit and not talk while their blood pressure is taken; that the last measurement matters more than the first; and that most Americans need a large-sized cuff. Current guidelines are based on patients sitting for 5-10 minutes alone in a quiet room while an automatic machine averages their last 3-5 blood pressures.
But when Dr. Yarows asked his 300 or so audience members – hypertension physicians who paid to come to the meeting – how many actually followed those rules, four hands went up. It’s not good enough; “if you are going to make a diagnosis that lasts a lifetime, you have to be accurate,” he said at the joint scientific sessions of the American Heart Association Council on Hypertension, AHA Council on Kidney Cardiovascular Disease, and American Society of Hypertension.
There’s resistance. No one has a room set aside for blood pressure; staff don’t want to deal with it; and at a time when primary care doctors are nickel-and-dimed for everything they do, insurers haven’t stepped up to pay to make accurate blood pressure measurement a priority.
To do it right, you have to ask patients to come in 10 minutes early and have a room set up where they can sit alone with a large oscillometric cuff that averages a few blood pressure readings at rest, Dr. Yarows said. They also need at least one round of 24-hour monitoring.
“Most of the time, the patient walks over from the waiting room, they get on the scale which automatically elevates the blood pressure tremendously, and then they sit down and talk about their family while their blood pressure is being taken.” Even in normotensive patients, that alone could raise systolic pressure 20 mm Hg or more, he said. It makes one-time blood pressure pretty much meaningless.
The biggest problem is that blood pressure is hugely variable, so it’s hard to know what matters. In one of Dr. Yarows’ normotensive patients, BP varied 44 mm Hg systolic and 37 mm Hg diastolic over 24 hours. In a hypertensive patient, systolic pressure varied 62 mm Hg and diastolic 48 mm Hg over 24 hours. Another patient was 114/85 mm Hg at noon, and 159/73 mm Hg an hour later. “That’s a huge spread,” he said.
Twenty-four-hour monitoring is the only way to really know whether patients are hypertensive and need treatment. “Any person you suspect of having hypertension, before you place them on medicine, you should have 24 hour blood pressure monitoring. This is the most effective way to determine if they do have high blood pressure,” and how much it needs to be lowered, he said.
Dr. Yarows had no disclosures.
EXPERT ANALYSIS FROM JOINT HYPERTENSION 2017
Obesity paradox slings its weight around in atrial fibrillation
BARCELONA – The obesity paradox is alive and well in the rapidly growing population with atrial fibrillation, Samuel Z. Goldhaber, MD, reported at the annual congress of the European Society of Cardiology.
In an analysis of 22,541 participants in the international registry known as GARFIELD-AF (Global Anticoagulation Registry in the Field-Atrial Fibrillation) – the largest ongoing prospective AF registry in the world – all-cause mortality during the first 2 years after diagnosis of the arrhythmia paradoxically decreased as body mass index increased all the way up to a ceiling of 40 kg/m2, according to Dr. Goldhaber, professor of medicine at Harvard Medical School and section head of vascular medicine and director of the thrombosis research group at Brigham and Women’s Hospital, Boston.
“It’s pretty impressive. This degree of mortality reduction with overweight and obesity is of about the same magnitude you might get with thrombolytic therapy for an acute MI, or with beta blocker therapy or statins post-MI,” the cardiologist said in an interview.
This newly described obesity paradox in AF is all the more baffling in light of the heavy cardiovascular risk factor burden that accompanied increased BMI. For example, the prevalence of type 2 diabetes rose from 14.1% among normal-weight individuals with AF, to 19.7% in the overweight, 27.5% in the obese, 34.6% in those with a BMI of 35 to less than 40 kg/m2, and 41.6% in patients with a BMI of 40 kg/m2 or more.
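For reference, the following minimal Python sketch computes BMI and assigns it to the bands used in this analysis. The article gives explicit boundaries only for underweight (below 20 kg/m2) and the upper bands (35 to less than 40, and 40 or more); the 25 and 30 boundaries for overweight and obesity are the conventional cutoffs and are assumed here.

```python
# Minimal illustration of the BMI bands referenced in the GARFIELD-AF analysis.
# The <20 underweight cutoff and the 35-<40 and >=40 bands come from the report;
# the 25 and 30 boundaries are conventional cutoffs assumed for completeness.

def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index = weight (kg) divided by height (m) squared."""
    return weight_kg / (height_m ** 2)

def bmi_band(value: float) -> str:
    if value < 20:
        return "underweight (<20 kg/m2)"
    elif value < 25:
        return "normal weight (20 to <25 kg/m2)"   # assumed conventional cutoff
    elif value < 30:
        return "overweight (25 to <30 kg/m2)"      # assumed conventional cutoff
    elif value < 35:
        return "obese (30 to <35 kg/m2)"
    elif value < 40:
        return "35 to <40 kg/m2"
    else:
        return ">=40 kg/m2"

if __name__ == "__main__":
    # Hypothetical patient, not study data.
    print(bmi_band(bmi(95.0, 1.70)))
```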
“It’s perplexing,” Dr. Goldhaber confessed. “I think there’s some hidden message here that we haven’t decoded yet.”
Patients with AF who were at the low extreme of the BMI spectrum – the 3.3% who were underweight, with a BMI below 20 kg/m2 – fared particularly poorly. Their 2-year all-cause mortality rate of 8.71 per 100 person-years was by far the highest of any BMI class in the study. But that’s relatively straightforward to understand, as it’s likely that many underweight patients with AF were frail and/or had cancer. Indeed, only 49% of deaths in the underweight group were due to cardiovascular disease, in contrast to 64% of deaths in persons whose BMI was 40 kg/m2 or more.
Interestingly, only 5%-6% of all deaths in GARFIELD-AF were due to stroke; deaths from heart failure were far more common.
Overall, 71% of patients who presented with newly diagnosed AF in GARFIELD-AF were overweight or obese. The only contributory factor to the obesity paradox that Dr. Goldhaber could spot was that the heavier patients were younger at diagnosis of AF. But this age disparity seems unlikely to overcome their massive burden of multiple other cardiovascular risk factors. So he solicited theories from his audience. One physician argued that morbidly obese individuals are sedentary, so they don’t leave the house as much as leaner AF patients.
“You stay at home because you cannot move. You don’t get into fatal car accidents. You don’t get into conflicts and get murdered,” the physician postulated.
Dr. Goldhaber wasn’t convinced.
“When I go out to bars or dancing, I look around at people, and I estimate that a lot of them have a BMI of 35,” he commented.
Another proposed theory was that overweight and obese AF patients exercise less, so they inhale less of the airborne toxic fine particulates present in the urban environment. The adverse health impact of air pollution is a particularly hot topic of late among European cardiologists, but the notion that obese AF patients fare better because they don’t exercise runs contrary to a wealth of data supporting the health benefits of working out.
The GARFIELD-AF registry is funded by Bayer AG. Dr. Goldhaber reported receiving research grants and/or serving as a consultant to Bayer and numerous other entities, including the National Heart, Lung and Blood Institute.
AT THE ESC CONGRESS 2017
Key clinical point: Overweight and obese patients with newly diagnosed atrial fibrillation had paradoxically lower 2-year all-cause mortality than normal-weight patients.
Major finding: Two-year all-cause mortality following diagnosis of atrial fibrillation fell substantially in stepwise fashion with increasing body mass index, for reasons unknown.
Data source: GARFIELD-AF is an ongoing enormous international prospective registry of patients newly diagnosed with atrial fibrillation.
Disclosures: The GARFIELD-AF registry is funded by Bayer AG. The presenter reported receiving research grants and/or consultant fees from that company and numerous others.