Short Course of Oral Antibiotics Effective for Acute Osteomyelitis and Septic Arthritis in Children
By Mark Shen, MD

Clinical question: Is a short course (less than four weeks) of antibiotics effective for the treatment of acute osteomyelitis and septic arthritis?
Background: The optimal duration of treatment for acute bone and joint infections in children has not been assessed adequately in prospectively designed trials. Historically, four- to six-week courses of intravenous (IV) antibiotics have been recommended, although the evidence for this practice is limited. There is widespread variation in both the route of administration (oral vs. IV) and the duration of treatment.
Study design: Prospective cohort study.
Setting: Two children’s hospitals in Australia.
Synopsis: Seventy children ages 17 and under who presented to two tertiary-care children’s hospitals with osteomyelitis or septic arthritis were enrolled. Primary surgical drainage was performed for patients with septic arthritis. Intravenous antibiotics were administered for at least three days and until clinical symptoms improved and C-reactive protein levels had stabilized. Patients were then transitioned to oral antibiotics and discharged to complete a minimum of three weeks of therapy (an illustrative sketch of these switch criteria follows this summary).
Fifty-nine percent of patients were converted to oral antibiotics by day three and 86% by day five of therapy. Based on clinical and hematologic assessment, 83% of patients had oral antibiotics stopped at the three-week follow-up and remained well through the 12-month follow-up period.
This study essentially involved prospective data collection for a cohort of children receiving standardized care. Although the results suggest that a majority of children can be treated with a three-week course of oral antibiotics, the results would have been strengthened further by an explicit protocol with well-defined criteria for the IV-to-oral transition and for cessation of antibiotic therapy. Additional limitations include pathogens and antibiotic choices that might not be applicable to North American populations.
Bottom line: After initial intravenous therapy, a three-week course of oral antibiotics can be effective for acute osteomyelitis and septic arthritis in children.
Citation: Jagodzinski NA, Kanwar R, Graham K, Bache CE. Prospective evaluation of a shortened regimen of treatment for acute osteomyelitis and septic arthritis in children. J Pediatr Orthop. 2009;29(5):518-525.
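The switch criteria described in the synopsis (a minimum IV course, clinical improvement, and a stabilized C-reactive protein) amount to a simple decision rule. The sketch below is illustrative only; the function and field names, and the way the criteria are combined, are assumptions rather than part of the study protocol.

```python
from dataclasses import dataclass

@dataclass
class PatientStatus:
    days_of_iv_therapy: int    # completed days of IV antibiotics
    clinically_improved: bool  # symptoms (fever, pain, mobility) improving
    crp_stabilized: bool       # C-reactive protein falling or plateaued

def eligible_for_oral_switch(status: PatientStatus, min_iv_days: int = 3) -> bool:
    """Illustrative rule: transition to oral antibiotics only after a minimum
    IV course AND clinical improvement AND CRP stabilization."""
    return (
        status.days_of_iv_therapy >= min_iv_days
        and status.clinically_improved
        and status.crp_stabilized
    )

# Example: a child on day four of IV therapy who has improved clinically.
day4 = PatientStatus(days_of_iv_therapy=4, clinically_improved=True, crp_stabilized=True)
print(eligible_for_oral_switch(day4))  # True -> complete at least three weeks of oral therapy
```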
Is Ease of Smoking Cessation an Early Sign of Parkinson’s Disease?
Patients with Parkinson’s disease can quit smoking more easily than healthy individuals can, according to data published October 14 in Neurology. A decreased responsiveness to nicotine during the prodromal phase of Parkinson’s disease may account for these findings.
The authors hypothesized that “ease of smoking cessation is an aspect of premanifest Parkinson’s disease similar to olfactory dysfunction, REM sleep disorders, or constipation.” The apparent neuroprotective effect of smoking observed in previous epidemiologic studies may result from reverse causation, they added.
Beate Ritz, MD, PhD, Professor and Vice Chair of the Epidemiology Department at the University of California, Los Angeles, and colleagues conducted a case–control study to investigate the relationship between smoking and Parkinson’s disease. The researchers examined the Danish National Hospital Register and identified 1,808 patients with Parkinson’s disease who had been diagnosed between 1996 and 2009. Using the Danish Central Population Registry, Dr. Ritz’s group matched the patients with 1,876 population controls by sex and year of birth. Through telephone interviews, the investigators obtained information on demographics, education, and lifestyle habits, including lifelong smoking history and use of nicotine substitutes.
Patients with Parkinson’s disease were less likely to have ever smoked cigarettes, and associations were strongest for current smokers, followed by former smokers. Former smokers who said that it was “extremely difficult to quit smoking” had a 31% decreased risk of developing Parkinson’s disease, compared with individuals who reported that “quitting was easy.”
Approximately 3% of cases and 5% of controls reported ever using nicotine substitutes. Participants who had ever used nicotine substitutes had a reduced risk of Parkinson’s disease. The use of nicotine substitutes was strongly associated with quitting difficulty and heavy smoking.
“We find a strong association between smoking and Parkinson’s disease, and a trend in risk with increasing smoking duration,” said Dr. Ritz. “These observations suggest that a mechanism associated with Parkinson’s disease risk may influence smoking behavior or that less reward from nicotinic stimulation might be an event prodromal to Parkinson’s disease,” she added.
The study results indicate that “practicing neurologists should not recommend cigarette use or nicotine substitutes to delay the onset of Parkinson’s disease,” said Linda A. Hershey, MD, PhD, Professor of Neurology at the University of Oklahoma Health Sciences Center in Oklahoma City, and Joel S. Perlmutter, MD, Head of the Movement Disorders Section at Washington University in St. Louis, in an accompanying editorial. “Physicians, including neurologists, should encourage their patients to stop smoking because cigarette smoking is a significant risk factor for stroke,” they concluded.
—Erik Greb
Suggested Reading
Hershey LA, Perlmutter JS. Smoking and Parkinson disease: Where there is smoke there may not be fire. Neurology. 2014;83(16):1392-1393.
Ritz B, Lee PC, Lassen CF, Arah OA. Parkinson disease and smoking revisited: Ease of quitting is an early sign of the disease. Neurology. 2014;83(16):1396-1402.
Once-Weekly Antibiotic Might Be Effective for Treatment of Acute Bacterial Skin Infections
Clinical question: Is once-weekly intravenous dalbavancin as effective as conventional therapy for the treatment of acute bacterial skin infections?
Background: Acute bacterial skin infections are common and often require hospitalization for intravenous antibiotic administration. Treatment covering gram-positive bacteria usually is indicated. Dalbavancin is effective against gram-positives, including MRSA. Its long half-life makes it an attractive alternative to other commonly used antibiotics, which require more frequent dosing.
Study design: Phase 3, double-blinded RCT.
Setting: Multiple international centers.
Synopsis: Researchers randomized 1,312 patients with acute bacterial skin and skin-structure infections with signs of systemic infection requiring intravenous antibiotics to receive dalbavancin on days one and eight, with placebo on other days, or several doses of vancomycin with an option to switch to oral linezolid. The primary endpoint was cessation of spread of erythema and temperature of ≤37.6°C at 48-72 hours. Secondary endpoints included a decrease in lesion area of ≥20% at 48-72 hours and clinical success at end of therapy (determined by clinical and historical features).
Results of the primary endpoint were similar in the dalbavancin and vancomycin-linezolid groups (79.7% and 79.8%, respectively), and the difference fell within the prespecified 10-percentage-point noninferiority margin (an illustrative margin check follows this summary). The secondary endpoints were similar between the groups.
Limitations of the study were the early primary endpoint and the lack of both a noninferiority analysis of the secondary endpoints and a cost-effectiveness analysis.
Bottom line: Once-weekly dalbavancin appears to be similarly efficacious to intravenous vancomycin in the treatment of acute bacterial skin infections in terms of outcomes within 48-72 hours of therapy and might provide an alternative to continued inpatient hospitalization for intravenous antibiotics in stable patients.
Citation: Boucher HW, Wilcox M, Talbot GH, Puttagunta S, Das AF, Dunne MW. Once-weekly dalbavancin versus daily conventional therapy for skin infection. N Engl J Med. 2014;370(23):2169-2179.
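The noninferiority claim above rests on simple arithmetic: the difference in early-response rates between arms, together with its confidence interval, must stay above the prespecified margin of -10 percentage points. A minimal sketch of that check, using a normal-approximation confidence interval, follows; the group sizes and the function itself are illustrative assumptions, not the trial’s actual statistical analysis.

```python
import math

def noninferiority_check(p_test, n_test, p_control, n_control, margin=0.10, z=1.96):
    """Illustrative noninferiority check on a rate difference (test - control):
    noninferiority is supported if the lower CI bound stays above -margin."""
    diff = p_test - p_control
    se = math.sqrt(p_test * (1 - p_test) / n_test + p_control * (1 - p_control) / n_control)
    lower, upper = diff - z * se, diff + z * se
    return diff, (lower, upper), lower > -margin

# Early-response rates reported in the trial; assuming, for illustration only,
# that the 1,312 randomized patients were split roughly evenly between arms.
diff, ci, noninferior = noninferiority_check(0.797, 656, 0.798, 656)
print(f"difference = {diff:+.3f}, 95% CI = ({ci[0]:+.3f}, {ci[1]:+.3f}), noninferior: {noninferior}")
```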
Continuous Positive Airway Pressure Outperforms Nocturnal Oxygen for Blood Pressure Reduction
Clinical question: What is the effect of continuous positive airway pressure (CPAP) or supplemental oxygen on ambulatory blood pressures and markers of cardiovascular risk when combined with sleep hygiene education in patients with obstructive sleep apnea (OSA) and coronary artery disease or cardiac risk factors?
Background: OSA is considered a risk factor for the development of hypertension. One meta-analysis showed reduction of mean arterial pressure (MAP) with CPAP therapy, but randomized controlled data on blood pressure reduction with treatment of OSA is lacking.
Study design: Randomized, parallel-group trial.
Setting: Four outpatient cardiology practices.
Synopsis: Patients ages 45-75 with OSA were randomized to receive nocturnal CPAP plus healthy lifestyle and sleep education (HLSE), nocturnal oxygen therapy plus HLSE, or HLSE alone. The primary outcome was 24-hour MAP. Secondary outcomes included fasting blood glucose, lipid panel, insulin level, erythrocyte sedimentation rate, C-reactive protein (CRP), and N-terminal pro-brain natriuretic peptide.
Participants had high rates of diabetes, hypertension, and coronary artery disease. At 12 weeks, the CPAP arm experienced greater reductions in 24-hour MAP compared to both the nocturnal oxygen and HLSE arms (-2.8 mmHg [P=0.02] and -2.4 mmHg [P=0.04], respectively). No significant decrease in MAP was identified in the nocturnal oxygen arm when compared to the HLSE arm. The only significant difference in secondary outcomes was a decrease in CRP in the CPAP arm compared to the HLSE arm, the clinical significance of which is unclear.
Bottom line: CPAP therapy with sleep hygiene education appears superior to both nocturnal oxygen therapy with sleep hygiene education and sleep hygiene education alone in decreasing 24-hour MAP in patients with OSA and coronary artery disease or cardiac risk factors.
Citation: Gottlieb DJ, Punjabi NM, Mehra R, et al. CPAP versus oxygen in obstructive sleep apnea. N Engl J Med. 2014;370(24):2276-2285.
Lactate Clearance Portends Better Outcomes after Cardiac Arrest
Clinical question: Is greater lactate clearance following resuscitation from cardiac arrest associated with lower mortality and better neurologic outcomes?
Background: Recommendations from the International Liaison Committee on Resuscitation for monitoring serial lactate levels in post-resuscitation patients are based primarily on extrapolation from other conditions, such as sepsis. Two single-center retrospective analyses found that effective lactate clearance was associated with decreased mortality. This association had not previously been validated in a multicenter, prospective study.
Study design: Multicenter, prospective, observational study.
Setting: Four urban, tertiary-care teaching hospitals.
Synopsis: Absolute lactate levels and the differences in the percent lactate change over 24 hours were compared in 100 patients who suffered out-of-hospital cardiac arrest. Ninety-seven percent received therapeutic hypothermia, and overall survival was 46%. Survivors and patients with a good neurologic outcome had lower lactate levels at zero hours (4.1 vs. 7.3), 12 hours (2.2 vs. 6.0), and 24 hours (1.6 vs. 4.4) compared with nonsurvivors and patients with bad neurologic outcomes.
The percent decrease in lactate was greater in survivors and in those with good neurologic outcomes (odds ratio, 2.2; 95% confidence interval, 1.1–4.4); an illustrative calculation follows this summary.
Nonsurvivors or those with poor neurologic outcomes were less likely to have received bystander CPR, to have suffered a witnessed arrest, or to have had a shockable rhythm at presentation. Superior lactate clearance in survivors and those with good neurologic outcomes suggests a potential role in developing markers of effective resuscitation.
Bottom line: Lower lactate levels and more effective clearance of lactate in patients following cardiac arrest are associated with improved survival and good neurologic outcome.
Citation: Donnino MW, Andersen LW, Giberson T, et al. Initial lactate and lactate change in post-cardiac arrest: a multicenter validation study. Crit Care Med. 2014;42(8):1804-1811.
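The “percent lactate change” used in this analysis is straightforward arithmetic: the relative fall from the initial (zero-hour) level. The snippet below applies it to the group-level values quoted in the synopsis; it is illustrative only, and the function name is an assumption, not taken from the study.

```python
def percent_lactate_clearance(initial, later):
    """Percent decrease in lactate relative to the initial value."""
    return 100.0 * (initial - later) / initial

# Group-level values from the synopsis: zero hours vs. 24 hours.
survivors = percent_lactate_clearance(4.1, 1.6)      # ~61% clearance
nonsurvivors = percent_lactate_clearance(7.3, 4.4)   # ~40% clearance
print(f"survivors: {survivors:.0f}%  nonsurvivors: {nonsurvivors:.0f}%")
```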
Time to Meds Matters for Patients with Cardiac Arrest Due to Nonshockable Rhythms
Clinical question: Is earlier administration of epinephrine in patients with cardiac arrest due to nonshockable rhythms associated with increased return of spontaneous circulation, survival, and neurologically intact survival?
Background: About 200,000 hospitalized patients in the U.S. have a cardiac arrest each year, commonly due to nonshockable rhythms. Cardiopulmonary resuscitation has been the only intervention with proven efficacy. There are no well-controlled trials examining the effect of epinephrine on survival and neurologic outcomes.
Study design: Prospective cohort from a large multicenter registry of in-hospital cardiac arrests.
Setting: Data from 570 hospitals from 2000 to 2009.
Synopsis: Authors included 25,095 adults from 570 hospitals who had cardiac arrests in hospital with asystole or pulseless electrical activity as the initial rhythm. Time to first administration of epinephrine was recorded and then separated into quartiles, and odds ratios were evaluated using one to three minutes as the reference group. Outcomes of survival to hospital discharge (10%), return of spontaneous circulation (47%), and survival to hospital discharge with favorable neurologic status (7%) were assessed.
Survival to discharge decreased as the time to administration of the first dose of epinephrine increased. Of those patients receiving epinephrine in one minute, 12% survived. This dropped to 7% for those first receiving epinephrine after seven minutes. Return of spontaneous circulation and survival to discharge with favorable neurologic status showed a similar stepwise decrease with longer times to first administration of epinephrine.
Bottom line: Earlier administration of epinephrine to patients with cardiac arrest due to nonshockable rhythms is associated with improved survival to discharge, return of spontaneous circulation, and neurologically intact survival.
Citation: Donnino MW, Salciccioli JD, Howell MD, et al. Time to administration of epinephrine and outcome after in-hospital cardiac arrest with non-shockable rhythms: retrospective analysis of large in-hospital data registry. BMJ. 2014;348:g3028.
Frailty Indices Tool Predicts Post-Operative Complications, Mortality after Elective Surgery in Geriatric Patients
Clinical question: Is there a more accurate way to predict adverse post-operative outcomes in geriatric patients undergoing elective surgery?
Background: More than half of all operations in the U.S. involve geriatric patients. Most tools hospitalists use to predict post-operative outcomes are focused on cardiovascular events and do not account for frailty. Common in geriatric patients, frailty is thought to influence post-operative outcomes.
Study design: Prospective cohort study.
Setting: A 1,000-bed academic hospital in Seoul, South Korea.
Synopsis: A cohort of 275 elderly patients (>64 years old) who were scheduled for elective intermediate or high-risk surgery underwent a pre-operative comprehensive geriatric assessment (CGA) that included measures of frailty. This cohort was then followed for mortality, major post-operative complications (pneumonia, urinary infection, pulmonary embolism, and unplanned transfer to intensive care), length of stay, and transfer to a nursing home. Post-operative complications, transfer to a nursing facility, and one-year mortality were associated with a derived scoring tool that included the Charlson Comorbidity Index, activities of daily living (ADL), instrumental activities of daily living (IADL), dementia, risk for delirium, mid-arm circumference, and a mini-nutritional assessment (a structural sketch of such a composite score follows this summary).
This tool was more accurate at predicting one-year mortality than the American Society of Anesthesiologists (ASA) classification.
Bottom line: This study establishes that measures of frailty predict post-operative outcomes in geriatric patients undergoing elective surgery; however, the authors’ tool relies on the CGA, which is time-consuming, cumbersome, and based on indices unfamiliar to many hospitalists.
Citation: Kim SW, Han HS, Jung HW, et al. Multidimensional frailty scores for the prediction of postoperative mortality risk. JAMA Surg. 2014;149(7):633-640.
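The published model combines the components listed in the synopsis into a single multidimensional score with weights derived from the cohort; those weights are not reproduced here. The sketch below illustrates only the general structure of such a composite, with entirely hypothetical weights and coding, and should not be mistaken for the authors’ instrument.

```python
# Hypothetical structure of a multidimensional frailty score: each component named
# in the synopsis contributes a weighted value. Weights and coding are placeholders,
# NOT the coefficients published by Kim et al.
HYPOTHETICAL_WEIGHTS = {
    "charlson_comorbidity_index": 1.0,
    "adl_dependency": 1.0,
    "iadl_dependency": 1.0,
    "dementia": 1.0,
    "delirium_risk": 1.0,
    "low_mid_arm_circumference": 1.0,
    "malnutrition_risk": 1.0,
}

def composite_frailty_score(components: dict) -> float:
    """Weighted sum of component scores (illustrative only)."""
    return sum(HYPOTHETICAL_WEIGHTS[name] * value for name, value in components.items())

# Example patient: two comorbidity points and one IADL dependency, all else absent.
example_patient = {name: 0 for name in HYPOTHETICAL_WEIGHTS}
example_patient["charlson_comorbidity_index"] = 2
example_patient["iadl_dependency"] = 1
print(composite_frailty_score(example_patient))  # higher totals imply higher predicted risk
```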
Pre-Operative Use of Angiotensin-Converting Enzyme Inhibitors, Angiotensin Receptor Blockers Examined in Elective Joint Replacement Surgery
Clinical question: Should angiotensin-converting enzyme inhibitors or angiotensin receptor blockers (ACEI/ARB) be held the morning of elective joint replacement?
Background: In patients taking an ACEI/ARB, the decision regarding whether or not to give these medications on the day of surgery is controversial. UpToDate recommends holding ACEI/ARB on the day of surgery; American College of Physicians guidelines and SHM Consult Medicine recommend giving these drugs on the day of surgery.
Study design: Retrospective cohort (case control) study.
Setting: A large academic hospital in Pennsylvania.
Synopsis: Researchers studied adults undergoing elective spinal fusion, total knee replacement, or total hip replacement, and compared outcomes in 323 patients who were taking an ACEI/ARB (study group) to outcomes in the 579 patients who were not taking an ACEI/ARB (control group) before surgery. It was assumed—but not studied—that the ACEI/ARB was continued the morning of surgery in all patients in the study group, because that was the standard practice at this hospital.
Compared to the control group, the study group had more post-induction hypotension (12.2% vs. 6.7%) and more post-operative acute kidney injury (5.76% vs. 3.28%). Patients who developed acute kidney injury had longer length of stay (5.76 vs. 3.28 days) but no difference in two-year mortality.
Patients in the study group had higher baseline creatinine, were older, were more likely to be taking a diuretic, and were more likely to have diabetes, heart failure, and coronary artery disease. The authors used multiple logistic regression to adjust for these differences (an illustrative adjustment appears after this summary). Anesthesia and intra-operative fluid management were not standardized or compared.
Bottom line: ACEI/ARB administration on the morning of elective major orthopedic surgery is likely associated with a higher risk of intra-operative hypotension and acute kidney injury.
Citation: Nielson E, Hennrikus E, Lehman E, Mets B. Angiotensin axis blockade, hypotension, and acute kidney injury in elective major orthopedic surgery. J Hosp Med. 2014;9(5):283-288.
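Because the ACEI/ARB group differed from controls at baseline, the authors adjusted for those differences with multiple logistic regression. The sketch below shows what such an adjustment looks like in general, on synthetic data; the covariates mirror those named in the synopsis, but the data, coefficients, and choice of statsmodels are illustrative assumptions, not the study’s actual analysis.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 900

# Synthetic covariates mirroring the baseline imbalances named in the synopsis.
acei_arb = rng.integers(0, 2, n)                          # exposure of interest
age = rng.normal(65, 10, n) + 3 * acei_arb                # exposed patients slightly older
baseline_creatinine = rng.normal(1.0, 0.2, n) + 0.1 * acei_arb
diabetes = rng.integers(0, 2, n)

# Synthetic outcome: post-operative acute kidney injury.
logit = -4 + 0.6 * acei_arb + 0.03 * (age - 65) + 1.0 * (baseline_creatinine - 1.0) + 0.4 * diabetes
aki = rng.binomial(1, 1 / (1 + np.exp(-logit)))

# Adjusted model: the ACEI/ARB odds ratio is interpreted after accounting for confounders.
X = sm.add_constant(np.column_stack([acei_arb, age, baseline_creatinine, diabetes]))
fit = sm.Logit(aki, X).fit(disp=False)
print("adjusted odds ratio for ACEI/ARB exposure:", round(float(np.exp(fit.params[1])), 2))
```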
No Mortality Difference Associated with Pre-Operative Beta Blocker Use for Coronary Artery Bypass Grafting Without Recent Myocardial Infarction
Clinical question: Is the use of beta blockers within 24 hours of coronary artery bypass grafting (CABG) surgery without recent myocardial infarction (MI) associated with decreased peri-operative mortality?
Background: Several retrospective observational studies suggest a reduction in peri-operative mortality with CABG surgery if beta blockers are administered prior to surgery. Although pre-operative beta blocker use for CABG is now a quality measure, the practice remains controversial because more recent studies suggest the observed benefit is driven mainly by patients with recent MI.
Study design: Retrospective cohort analysis.
Setting: More than 1,100 U.S. hospitals.
Synopsis: The Society of Thoracic Surgeons’ National Adult Cardiac Surgery database identified 506,110 adult patients (without MI within 21 days) nonemergently undergoing CABG surgery. Beta blocker use was defined as receiving a beta blocker within 24 hours before surgery. Although most patients (86%) received beta blockers prior to surgery, there was no significant difference in operative mortality, permanent stroke, prolonged ventilation, or renal failure between patients who received beta blockers and those who did not, although atrial fibrillation (Afib) was more common with pre-operative beta blocker use.
Bottom line: For patients undergoing nonemergent CABG surgery without recent MI, pre-operative beta blocker use is not associated with improved outcomes and is associated with slightly higher rates of Afib.
Citation: Brinkman W, Herbert MA, O’Brien S, et al. Preoperative beta-blocker use in coronary artery bypass grafting surgery: national database analysis. JAMA Intern Med. 2014;174(8):1320-1327.
Delirium Severity Scoring System CAM-S Correlates with Length of Stay, Mortality
Clinical question: Does the CAM-S, a modification of the Confusion Assessment Method (CAM) designed to measure delirium severity, correlate with clinical outcomes?
Background: In 1990, Dr. Sharon Inouye developed the CAM, which remains a widely used standard measure for identifying the presence of delirium. Although other scoring systems exist to quantify delirium severity, Dr. Inouye and colleagues now propose the CAM-S, an extension of the CAM that measures delirium severity.
Study design: Validation analysis.
Setting: Three academic medical centers in the U.S.
Synopsis: Two validation cohorts of patients 70 years or older without dementia who were at moderate to high risk of developing delirium during hospitalization were studied. The first cohort comprised 300 patients scheduled for elective, noncardiac surgery; the second comprised 250 patients admitted to an inpatient medical service. The CAM-S uses the same items as the original CAM and rates each symptom 0 for absent, 1 for mild, or 2 for marked; acute onset or fluctuation receives 0 (absent) or 1 (present). Higher CAM-S scores correlated with various outcome measures, including increased length of stay, new nursing home placement, and 90-day mortality.
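To make the scoring rule concrete, here is a minimal, purely illustrative Python sketch of a short-form CAM-S sum built from the four CAM features described above. The item names are ours, and the 0-7 range simply follows from one 0-1 item plus three 0-2 items; this is a teaching sketch, not a validated clinical instrument.

from dataclasses import dataclass

@dataclass
class CamSItems:
    acute_onset_or_fluctuation: int  # 0 = absent, 1 = present
    inattention: int                 # 0 = absent, 1 = mild, 2 = marked
    disorganized_thinking: int       # 0 = absent, 1 = mild, 2 = marked
    altered_consciousness: int       # 0 = absent, 1 = mild, 2 = marked

def cam_s_short_score(items: CamSItems) -> int:
    """Sum the item ratings; higher totals indicate more severe delirium (range 0-7)."""
    assert items.acute_onset_or_fluctuation in (0, 1)
    for rating in (items.inattention, items.disorganized_thinking, items.altered_consciousness):
        assert rating in (0, 1, 2)
    return (items.acute_onset_or_fluctuation + items.inattention
            + items.disorganized_thinking + items.altered_consciousness)

# Example: fluctuating course with marked inattention and mild disorganized thinking.
print(cam_s_short_score(CamSItems(1, 2, 1, 0)))  # prints 4

Because the score is a simple sum of graded symptoms, it orders patients by severity rather than giving a yes/no diagnosis, which is what allows it to be correlated with outcomes such as length of stay and mortality.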
Bottom line: Higher scores on the CAM-S, a scoring system based on the CAM and designed to measure delirium severity, are associated with worse in-hospital and post-discharge outcomes.
Citation: Inouye SK, Kosar CM, Tommet D, et al. The CAM-S: development and validation of a new scoring system for delirium severity in 2 cohorts. Ann Intern Med. 2014;160(8):526-533.