ITL: Physician Reviews of HM-Relevant Research
In This Edition
Literature At A Glance
A guide to this month’s studies
- Online calculator helps prevent post-op respiratory failure
- New drug for long-term treatment of PE
- Benefits of triple therapy for COPD
- Knee-length compression stockings as good as thigh-length for PTS
- Video monitoring improves hand hygiene
- Asymptomatic bacteriuria often misdiagnosed as UTI
- CT accurate for lower GI bleeding diagnosis
- Switch from albuterol to levalbuterol to reduce tachycardia not recommended
Preoperative Risk Calculator Can Help Predict Postoperative Respiratory Failure
Clinical question: Can preoperative factors identify patients at risk for postoperative respiratory failure (PRF)?
Background: PRF—when a patient requires mechanical ventilation >48 hours after surgery or needs unplanned intubation within 30 days of surgery—is associated with high mortality, with 30-day mortality rates of 26%.
Study design: Analysis of multicenter, prospective databases of the American College of Surgeons National Surgical Quality Improvement Program (NSQIP).
Setting: Analysis of NSQIP data from 2007 (training set) and 2008 (validation set).
Synopsis: PRF was seen in 3.1% of patients in the 2007 data set and 2.6% in the 2008 data set. Those with PRF had significantly higher mortality rates than those without PRF (25.62% vs. 0.98%; P<0.0001). Preoperative risk factors associated with significantly increased risk of PRF were American Society of Anesthesiologists’ class, functional status, emergent nature of procedure, type of surgery, and preoperative sepsis.
The 2007 data set was used to develop the model, and the 2008 data set was used as a validation set. The selected risk model showed similar results in both sets with a c-statistic of 0.91 in the training set and 0.90 in the validation set. This selected model was then used to develop an interactive calculator predicting PRF (available at www.surgicalriskcalculator.com/prf-risk-calculator).
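For readers curious about the mechanics, risk calculators of this kind are typically logistic-regression models: each preoperative factor contributes to a linear score that is converted into a predicted probability. The sketch below illustrates that structure using the five predictors named above; the intercept and coefficients are hypothetical placeholders for illustration only, not the published NSQIP model, so the calculator at the URL above should be used for actual estimates.

```python
import math

# Hypothetical intercept and coefficients, for illustration only.
# The published NSQIP PRF model's actual values are not reproduced here.
INTERCEPT = -5.0
COEFFICIENTS = {
    "asa_class": 0.6,           # ASA physical status class (1-5)
    "dependent_function": 0.9,  # 1 if partially/totally dependent, else 0
    "emergency_case": 0.7,      # 1 if emergent procedure, else 0
    "high_risk_surgery": 1.1,   # 1 if high-risk surgery type, else 0
    "preop_sepsis": 1.0,        # 1 if preoperative sepsis present, else 0
}

def predicted_prf_risk(patient: dict) -> float:
    """Convert preoperative factors into a predicted PRF probability
    using the standard logistic form: p = 1 / (1 + exp(-(b0 + sum(bi * xi))))."""
    score = INTERCEPT + sum(COEFFICIENTS[k] * patient[k] for k in COEFFICIENTS)
    return 1.0 / (1.0 + math.exp(-score))

# Illustrative patient; the printed probability is a product of the made-up coefficients.
example = {"asa_class": 3, "dependent_function": 1, "emergency_case": 0,
           "high_risk_surgery": 1, "preop_sepsis": 0}
print(f"Predicted PRF risk: {predicted_prf_risk(example):.1%}")
```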
Bottom line: The PRF risk calculator can identify patients at high risk for PRF, which can aid in tailoring preventive strategies for patients prior to surgery.
Citation: Gupta H, Gupta PK, Fang X, et al. Development and validation of a risk calculator predicting postoperative respiratory failure. Chest. 2011;140:1207-1215.
New Drug for Treatment of Acute Symptomatic Pulmonary Embolism
Clinical question: Is the incidence of recurrent venous thromboembolism (VTE) or bleeding with use of idrabiotaparinux comparable to warfarin for treatment of acute symptomatic pulmonary embolism (PE)?
Background: Warfarin is an effective treatment for PE; however, maintenance of effective and safe levels of anticoagulation is difficult to achieve. A straightforward treatment option would be an attractive alternative. Idrabiotaparinux, a factor Xa inhibitor bound with a biotin moiety, is a weekly subcutaneous injection proposed as an alternative to warfarin for treatment of PE.
Study design: Industry-sponsored double-blind, randomized controlled trial.
Setting: 291 centers in 37 countries.
Synopsis: In all, 3,202 patients aged 18 to 96 years were randomly assigned to receive either enoxaparin followed by idrabiotaparinux, or enoxaparin overlapped with and followed by warfarin, for three or six months. The incidence of recurrent VTE (including fatal and nonfatal PE or deep vein thrombosis) did not differ between the two treatment arms.
Of the 1,599 patients treated with idrabiotaparinux, 48 (3%) had a recurrence; of the 1,603 treated with warfarin, 97 (6%) had a recurrence (odds ratio, 0.49). The rate of clinically relevant bleeding was also similar, with 72 (5%) in the idrabiotaparinux group versus 106 (7%) in the warfarin group. Much like warfarin, idrabiotaparinux requires bridging therapy with initial low-molecular-weight heparin.
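The reported odds ratio can be approximately re-derived from the event counts above; the short check below is included only to make the arithmetic explicit (the trial's published figure of 0.49 comes from its formal analysis, so a hand calculation from the rounded counts lands slightly lower).

```python
# Recurrent VTE: 48 of 1,599 idrabiotaparinux patients vs. 97 of 1,603 warfarin patients.
events_idra, n_idra = 48, 1599
events_warf, n_warf = 97, 1603

odds_idra = events_idra / (n_idra - events_idra)  # 48 / 1551
odds_warf = events_warf / (n_warf - events_warf)  # 97 / 1506
odds_ratio = odds_idra / odds_warf

print(f"Odds ratio: {odds_ratio:.2f}")  # ~0.48 from these counts, close to the reported 0.49
```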
Bottom line: Use of enoxaparin followed by weekly subcutaneous injection of idrabiotaparinux was as effective as enoxaparin followed by warfarin for preventing recurrent VTE, and may provide a suitable option for treatment of acute symptomatic PE.
Citation: Buller HR, Gallus AS, Pillion G, Prins MH, Raskob GE. Enoxaparin followed by once-weekly idrabiotaparinux versus enoxaparin plus warfarin for patients with acute symptomatic pulmonary embolism: a randomised, double-blind, double-dummy, non-inferiority trial. Lancet. 2012;379:123-129.
Triple Therapy Better than Double for COPD
Clinical question: Does addition of tiotropium to inhaled corticosteroids and long-acting beta-agonists (LABA) have an additive benefit in reducing mortality, hospital admissions, and exacerbations in COPD?
Background: Triple therapy in COPD involves adding LABA and long-acting antimuscarinics (LAMA), such as tiotropium, to inhaled corticosteroids (ICS). Despite the guidelines recommending triple therapy for severe COPD, most studies have evaluated either LAMA or LABA plus ICS, but not all three together.
Study design: Retrospective cohort.
Setting: Tayside, Scotland’s National Health Services database.
Synopsis: Patients with severe COPD were divided into two groups: 1,857 patients had received ICS+LABA (double therapy) and 996 had received ICS+LABA+tiotropium (triple therapy), with follow-up of 4.65 years.
All-cause mortality was 35% lower in the triple therapy group (HR 0.65, 95% CI 0.57-0.75). Corticosteroid use was 29% lower (HR 0.71, 95% CI 0.63-0.80), and hospital admissions were 15% lower (HR 0.85, 95% CI 0.73-0.99) in the triple-therapy group. These results were adjusted for smoking, age, sex, socioeconomic status, and history of diabetes, cardiovascular, and respiratory disease.
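The percentage reductions quoted above follow directly from the hazard ratios, since the relative reduction is simply 1 minus the HR; the brief check below re-derives them from the reported figures.

```python
hazard_ratios = {
    "all-cause mortality": 0.65,
    "corticosteroid use": 0.71,
    "hospital admissions": 0.85,
}

for outcome, hr in hazard_ratios.items():
    reduction = (1 - hr) * 100  # relative reduction implied by the hazard ratio
    print(f"{outcome}: HR {hr:.2f} -> {reduction:.0f}% lower with triple therapy")
```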
This study is limited by its observational retrospective design but provides good evidence of the need for randomized controlled trials to validate the clinical benefits of triple therapy.
Bottom line: Adding tiotropium to ICS plus LABA is associated with lower all-cause mortality, hospitalizations, and corticosteroid use when compared with ICS plus LABA, validating its current use in management of severe COPD.
Citation: Short PM, Williamson PA, Elder DHJ, Lipworth SIW, Schembri S, Lipworth BJ. The impact of tiotropium on mortality and exacerbations when added to inhaled corticosteroids and long-acting β-agonist therapy in COPD. Chest. 2012;141:81-86.
Above-Knee Compression Stockings Not Better than Below-Knee Stockings for Protection Against Post-Thrombotic Syndrome
Clinical question: Do above-knee compression elastic stockings prevent post-thrombotic syndrome (PTS) better than below-knee stockings?
Background: PTS—characterized by leg pain, cramps, edema, and hyperpigmentation—occurs in 25% to 50% of patients after an episode of deep venous thrombosis (DVT). Previous studies demonstrated a 50% reduction in the incidence of PTS when patients used below-knee stockings.
Study design: Open-label, randomized clinical trial.
Setting: Eight hospitals in Italy.
Synopsis: A total of 267 patients with their first episode of DVT were randomized to thigh-length or below-knee compression elastic stockings, as well as therapeutic anticoagulation, with a primary endpoint of three-year incidence of PTS. Assessment was done by study personnel who were blinded to the type of stocking the patients had been prescribed. Severity of PTS was graded by a scoring system incorporating objective and subjective criteria with an independent adjudicator.
The intention-to-treat analysis showed no significant difference in the three-year incidence of PTS between thigh- and knee-length stockings (32.6% vs. 35.6%, respectively). Compliance was better in the knee-length group (82.6%) than in the thigh-length group (66.7%) because of a significantly lower rate of stocking-related side effects.
The study is limited by a lack of blinding in the study participants.
Bottom line: Knee-length stockings offer protection against PTS similar to thigh-length stockings, with better compliance.
Citation: Prandoni P, Noventa F, Quintavalla R, et al. Thigh-length versus below-knee compression elastic stockings for prevention of the post-thrombotic syndrome in patients with proximal-venous thrombosis: a randomized trial. Blood. 2012;119:1561-1565.
Video Auditing With Near-Real-Time Feedback Improves Hand Hygiene Practices
Clinical question: Does the use of direct video monitoring with continuous, multi-modal feedback promote improvement in healthcare workers’ compliance with hand hygiene?
Background: Appropriate hand hygiene is an effective means of infection control. Direct human observation of hand hygiene compliance provides only a biased, temporary, and often overestimated assessment. The use of video-based monitoring technology in other settings (e.g., traffic signal cameras) has been shown to modify behavior.
Study design: Prospective cohort study.
Setting: Tertiary-care hospital’s 17-bed medical ICU in the northeastern U.S.
Synopsis: Through the use of 21 motion-activated video cameras with continuous third-party auditing, the provision of near real-time feedback improved hand hygiene rates of healthcare workers from 6.5% to 81.6%. In the four months preceding feedback, only 3,933 hand-washing events out of 60,542 (6.5%) were considered “passing.” During the active feedback period, 59,627 events out of 73,080 (81.6%) passed.
The improvement was sustained in the maintenance period of the study, with an average hand hygiene compliance rate of 87.9%. The improvement required both active provision of feedback and the presence of monitoring equipment, which may limit the applicability of this approach given the cost of the technology and the staffing needed to provide feedback.
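The compliance figures above are simple proportions of audited hand-hygiene events; the snippet below reproduces the pre-feedback and active-feedback rates from the reported counts.

```python
periods = {
    "pre-feedback (4 months)": (3933, 60542),
    "active feedback": (59627, 73080),
}

for period, (passed, total) in periods.items():
    print(f"{period}: {passed:,}/{total:,} = {passed / total:.1%} passing")
```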
Bottom line: Hand hygiene practices improve when healthcare workers are given immediate feedback on their compliance.
Citation: Armellino D, Hussain E, Schilling ME, et al. Using high-technology to enforce low-technology safety measures: the use of third-party remote video auditing and real-time feedback in healthcare. Clin Infect Dis. 2012;54(1):1-7.
Mismanagement of Enterococcal Bacteriuria
Clinical question: Are clinical providers following appropriate guidelines to identify and manage enterococcal bacteriuria?
Background: There are specific evidence-based guidelines for the diagnosis and treatment of urinary tract infections (UTI) and asymptomatic bacteriuria (ABU). ABU is often mistaken for a UTI, and incorrectly treated as one.
Study design: Retrospective cohort.
Setting: Two academic teaching hospitals in Houston, Texas.
Synopsis: Using the current Infectious Diseases Society of America (IDSA) guidelines, 375 Enterococcus urine cultures were reviewed and classified as either UTI or ABU. The cultures were initially reviewed for appropriate treatment and again 30 days later for complications. UTI was defined as bacteriuria with one or more signs or symptoms (urgency, frequency, dysuria, suprapubic tenderness, flank pain, rigors, visible hematuria, delirium, or fevers) without another identifiable cause. ABU was defined as bacteriuria without any of these signs or symptoms, or with a clear nonurinary source.
Of the 339 cultures matching inclusion criteria, 156 were classified as UTI and 183 classified as ABU. Sixty of the 183 ABU (32.8%) were inappropriately treated with antibiotics, while antibiotics were withheld in 23 of the 156 UTI (14.7%). Eighty-three of 339 cultures (24.5%) were incorrectly treated. The most common reason for ABU being inappropriately treated was the presence of pyuria, associated with a threefold higher use of antibiotics.
There was no significant difference in subsequent infections or infectious complications between UTI and ABU.
Bottom line: Enterococcal ABU is frequently treated with antibiotics, even though guidelines recommend against it; providers should resist overtreating enterococcal ABU.
Citation: Lin E, Bhusal Y, Horwitz D, Shelburne SA, Trautner BW. Overtreatment of enterococcal bacteriuria. Arch Intern Med. 2012;172:33-38.
CT Angiography for the Diagnosis of Acute Lower GI Bleeding in an Emergency Setting
Clinical question: Is CT angiography a reliable initial diagnostic procedure to identify the presence and location of an acute lower gastrointestinal (GI) bleed in the ED setting?
Background: CT angiography has been identified as a potentially useful procedure to identify acute GI bleeds; however, the specific role and timing of the procedure has not been clearly identified.
Study design: Prospective study.
Setting: ED of a university-based hospital in Madrid.
Synopsis: CT angiography was performed on 47 ED patients (27 men, 20 women, with a mean age of 68 years) with an acute lower GI bleed. Study protocol included a preliminary unenhanced CT scan followed by CT angiogram prior to the standard clinical protocol, which included colonoscopy, angiography, or laparotomy.
Images were reviewed by two different ED radiologists, who were blinded to the diagnosis, and compared with the standard protocol findings. CT angiography correctly identified active acute or recent GI bleeding in 46 of the 47 patients, with a sensitivity of 100% (19 of 19), NPV of 100% (27 of 27), specificity of 96% (27 of 28), and PPV of 95% (19 of 20). CT angiography also was 93% accurate in identifying the cause of the GI bleed when compared with the standard reference.
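The performance figures quoted above follow from the reported counts (19 true positives, 27 true negatives, 1 false positive, no false negatives) using the standard 2x2 definitions; the sketch below re-derives them.

```python
# Counts as reported: 19 TP, 27 TN, 1 FP, 0 FN (47 patients in total).
tp, tn, fp, fn = 19, 27, 1, 0

sensitivity = tp / (tp + fn)  # 19/19 = 100%
specificity = tn / (tn + fp)  # 27/28 = 96%
ppv = tp / (tp + fp)          # 19/20 = 95%
npv = tn / (tn + fn)          # 27/27 = 100%

for name, value in [("sensitivity", sensitivity), ("specificity", specificity),
                    ("PPV", ppv), ("NPV", npv)]:
    print(f"{name}: {value:.0%}")
```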
Limitations of the study include its small size and the lack of a control group.
Bottom line: CT angiography is an accurate and more readily available modality for the diagnosis of acute lower GI bleeding, though it does not provide a therapeutic option.
Citation: Martí M, Artigas JM, Garzón G, Alvarez-Sala R, Soto JA. Acute lower intestinal bleeding: feasibility and diagnostic performance of CT angiography. Radiology. 2012;262:109-116.
Substitution of Levalbuterol to Avoid Tachyarrhythmia Not Supported
Clinical question: Does substitution of levalbuterol for albuterol in critically ill adult patients result in decreased incidence of tachyarrhythmias?
Background: Studies have indicated an increased risk of mortality from tachycardia and tachyarrhythmias in ICU patients. Levalbuterol is the R-isomer of albuterol, and it has been proposed that it may mitigate cardiac side effects seen with beta-2 agonists. For this reason, some clinicians have advocated using nebulized levalbuterol in critically ill patients.
Study design: Prospective randomized controlled trial with patient crossover.
Setting: Single academic medical center.
Synopsis: All ICU patients in a single teaching hospital were screened, and 70 patients were included. Patients were randomly crossed over between albuterol and levalbuterol every four to six hours. This resulted in a total of 836 measurements of heart rate, the primary outcome measurement. The study showed no clinically significant differences in average heart rate when using levalbuterol versus albuterol. This was a small study of ICU patients, using a surrogate endpoint of heart rate rather than mortality. Furthermore, the assessment of tachyarrhythmias was limited given the study size and relative rarity of these events. Despite these limitations, the study casts significant doubt on the theory and practice of switching from albuterol to levalbuterol solely for the purpose of reducing or avoiding tachycardia or tachyarrhythmias.
Bottom line: Substitution of levalbuterol for albuterol to avert tachycardia in critically ill patients is not warranted.
Citation: Khorfan FM, Smith P, Watt S, Barber KR. Effects of nebulized bronchodilator therapy on heart rate and arrhythmias in critically ill adult patients. Chest. 2011;140:1466-1472.
In the Literature: Research You Need to Know
Clinical question: What is the in-hospital mortality risk associated with hospital-acquired Clostridium difficile infection after accounting for time to infection and baseline mortality risk at admission?
Background: Hospital-acquired C. diff infection (CDI) has been shown to be associated with a higher mortality rate and longer length of stay and cost. Previous studies have demonstrated an independent association of mortality with CDI, but have not incorporated time to infection and baseline mortality risk in the analyses.
Study design: Retrospective observational study.
Setting: Single-center, tertiary-care teaching hospital.
Synopsis: Patients who were hospitalized for more than three days were eligible. A baseline in-hospital mortality risk was estimated for each patient using an internally validated tool. A total of 136,877 admissions were identified. Mean baseline mortality risk was 1.8%. Overall rate of CDI was 1.02%.
Patients in the highest decile of baseline mortality risk had a higher rate of CDI than patients in the lowest decile (2.6% vs. 0.2%). Median time to diagnosis was 12 days. CDI was associated with an unadjusted fourfold higher risk of in-hospital death. When baseline mortality risk was included, the RR of death with CDI was 1.99 (95% CI 1.81-2.19).
Patients in the lowest decile of baseline mortality risk had the highest relative risk of death associated with CDI (RR 45.70, 95% CI 11.35-183.98), compared with those in the highest decile (RR 1.29, 95% CI 1.11-1.50). Cox modeling estimated a threefold increase in the risk of death.
This study is limited by being single-site and the mortality risk model has not been validated externally. Results are also estimated from a small number of cases in the lower deciles.
Bottom line: CDI is associated with an approximately threefold higher in-hospital mortality. Patients with a higher baseline mortality risk are more likely to acquire CDI but experience a smaller relative increase in mortality than patients with a lower baseline risk. Hospitals should continue their efforts to reduce rates of CDI.
Citation: Oake N, Taljaard M, van Walraven C, Wilson K, Roth V, Forster AJ. The effect of hospital-acquired Clostridium difficile infection on in-hospital mortality. Arch Intern Med. 2010;170(20):1804-1810.
For more physician reviews of HM-related research, visit our website.
In the Literature: HM-Related Research You Need to Know
In This Edition
Literature at a Glance
A guide to this month’s studies
- Early ambulation and LOS in geriatric patients
- Patient-safety movement and hospital harm rates
- Lifestyle modification and weight loss
- Outcomes of transcatheter aortic-valve implantation
- Tool for predicting mortality in advanced dementia
- Residents’ opinion of new duty-hour regulations
- Renal ultrasound predictor for acute kidney injury
- Romiplostim use in immune thrombocytopenia
Increasing Ambulation within 48 Hours of Admission Decreases LOS by Two Days
Clinical question: Is there an association between an early increase in ambulation and length of stay (LOS) in geriatric patients admitted with an acute illness?
Background: Early ambulation is known to improve recovery from illnesses such as pneumonia and myocardial infarction, and to prevent complications after hip fracture surgery. However, no specific guidelines address ambulation in older patients admitted with acute illness.
Study design: Prospective, nonblinded study.
Setting: Acute-care geriatric unit in an academic medical center.
Synopsis: A total of 162 patients 65 or older were studied. Data were collected during a four-month period in 2009. A Step Activity Monitor (SAM) was placed on admission. Patients were instructed to walk as usual. Investigators measured the number of steps taken per day and change in steps between the first and second day.
Patients averaged 662.1 steps per day, with a mean step change of 196.5 steps. The adjusted mean difference in LOS for patients who increased their total steps by 600 or more between the first and second day was 2.13 days (95% CI, 1.05-3.97). Patients who had low or negative changes in steps had longer LOS. The 32 patients who increased their steps by more than 600 were more likely to be men (P=0.02), to ambulate independently (P<0.01), and to have admitting orders of “ambulate with assist” (P=0.03).
One limitation of this study is that patients who walked more might have been less ill or very functional on admission.
Bottom line: Increasing ambulation early in a hospitalization (first two days) is associated with a decreased LOS in an elderly population.
Citation: Fisher SR, Kuo YF, Graham JE, Ottenbacher KJ, Ostir GV. Early ambulation and length of stay in older adults hospitalized for acute illness. Arch Intern Med. 2010;170(21):1942-1943.
Despite Efforts to Improve Patient Safety in Hospitals, No Reduction in Longitudinal Rates of Harm
Clinical question: As hospitals focus more on programs to improve patient safety, has the rate of harms decreased?
Background: Since the Institute of Medicine published a groundbreaking report (To Err is Human) a little more than a decade ago, policymakers, hospitals, and healthcare organizations have focused more on efforts to improve patient safety with the goal of reducing harms. It is not clear if these efforts have reduced harms.
Study design: Retrospective chart review.
Setting: Ten hospitals in North Carolina.
Synopsis: Ten charts per quarter were randomly selected from each hospital from January 2002 through December 2007. Internal and external reviewers used the IHI Global Trigger Tool for Measuring Adverse Events to identify rates of harm. Harms were classified into categories of severity and assessed for preventability.
Kappa scores were generally higher for internal reviewers, indicating better inter-rater reliability. Internal reviewers identified 588 harms over 10,415 patient days (25.1 harms per 100 admissions), occurring in 423 unique patients (18.1%). A majority (63.1%) of harms were considered preventable. Forty-one percent of harms were temporary and required intervention; 2.4% caused or contributed to a patient’s death.
There was no significant change over time in the rate of harms (regardless of reviewer type) even after adjusting for demographics.
This study is limited because it is based only in North Carolina hospitals. It was not powered to evaluate change in individual hospitals. There might have been unmeasurable improvements that were not accounted for by the trigger tool.
Bottom line: Despite an increased focus on patient safety, investigators did not find a decrease in the rate of harms, and a majority of the harms identified were preventable. These findings should not discourage continued efforts to improve patient safety.
Citation: Landrigan CP, Parry GJ, Bones CB, Hackbarth AD, Goldmann DA, Sharek PJ. Temporal trends in rates of patient harm resulting from medical care. N Engl J Med. 2010;363(22):2124-2134.
Intensive Lifestyle Modification Improves Weight Loss in Severely Obese Individuals
Clinical question: Does the combination of diet modification and increased physical activity lead to weight loss and improve health risks in severely obese patients?
Background: Obesity is at epidemic proportions, but there are no evidence-based treatment guidelines for severe obesity.
Study design: Randomized, single-blind trial.
Setting: Community volunteers.
Synopsis: A total of 130 individuals with a body mass index (BMI) of ≥35 were randomized to a lifestyle intervention consisting of diet plus physical activity begun at the outset and continued for 12 months, or diet alone for six months with physical activity added for the remaining six months.
The initial-physical-activity group demonstrated greater weight loss at six months, but the overall weight loss did not differ between the two groups. At 12 months, the initial-physical-activity group lost 12.1 kg and the delayed-physical-activity group lost 9.87 kg. Both groups demonstrated significantly reduced blood pressure, reduced serum liver enzymes, and improved insulin resistance.
Candidates with a history of coronary artery disease, uncontrolled blood pressure, or diabetes were excluded. Participants were provided with prepackaged meal replacements for the first six months and received financial compensation for participation in the study.
This study is limited by the fact that a majority of the participants were female (85.1%). Providing meals to the participants also limits the application of this program to the general public.
Bottom line: The results of this study underscore the importance of diet and exercise for weight loss in severely obese individuals. However, adherence to the study’s goals required multiple individual and group meetings throughout the year, the provision of prepackaged meals, and some financial incentive for compliance.
Citation: Goodpaster BH, Delany JP, Otto AD, et al. Effects of diet and physical activity interventions on weight loss and cardiometabolic risk factors in severely obese adults: a randomized trial. JAMA. 2010;304(16):1795-1802.
Transcatheter Aortic-Valve Implantation Is Superior to Standard Nonoperative Therapy for Symptomatic Aortic Stenosis
Clinical question: Is there a mortality benefit to transcatheter valve implantation over standard therapy in nonsurgical candidates with severe aortic stenosis (AS)?
Background: Untreated, symptomatic AS carries a high mortality rate, but a significant proportion of patients with severe aortic stenosis are poor surgical candidates. Available since 2002, transcatheter aortic-valve implantation (TAVI) is a promising, nonsurgical treatment option for severe AS. However, to date, TAVI has lacked rigorous clinical-trial data.
Study design: Prospective, multicenter, randomized, active-treatment-controlled clinical trial.
Setting: Twenty-one centers, 17 of which were in the U.S.
Synopsis: A total of 358 patients with severe AS who were considered nonsurgical candidates were randomized to either TAVI or standard therapy. A majority (83.8%) of the patients in the standard group underwent balloon aortic valvuloplasty.
Researchers found a significant reduction (HR 0.55, 95% CI 0.40 to 0.74, P<0.001) in all-cause mortality at one year in patients undergoing TAVI (30.7%) vs. standard therapy (50.7%). Additional benefits included lower rates of the composite endpoint of death from any cause or repeat hospitalization (42.5% vs. 71.6%, P<0.001) and of NYHA Functional Class III or IV symptoms (25.2% vs. 58.0%, P<0.001) at one year. However, higher incidences of major stroke (5.0% vs. 1.6%, P=0.06) and major vascular complications (16.2% vs. 1.1%, P<0.001) were seen.
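As a rough illustration (a calculation not reported by the trial), the one-year mortality figures correspond to an absolute risk reduction of about 20 percentage points, or roughly five patients treated with TAVI to prevent one death at one year:
\[ \mathrm{ARR} = 50.7\% - 30.7\% = 20.0\%, \qquad \mathrm{NNT} = \frac{1}{\mathrm{ARR}} = \frac{1}{0.20} = 5 \]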
While this study clearly demonstrated a one-year mortality benefit of TAVI over standard nonoperative therapy, hospitalists should interpret these data cautiously with respect to their inpatient populations, as exclusion criteria were extensive, including bicuspid or noncalcified aortic valve, LVEF less than 20%, and severe renal insufficiency. Additionally, standard therapy was poorly delineated.
Bottom line: TAVI should be considered in patients with severe aortic stenosis who are not suitable surgical candidates.
Citation: Leon MB, Smith CR, Mack M, et al. Transcatheter aortic-valve implantation for aortic stenosis in patients who cannot undergo surgery. N Engl J Med. 2010;363(17):1597-1607.
ADEPT Score Better Predicts Six-Month Mortality in Nursing Home Residents with Advanced Dementia
Clinical question: Are current Medicare hospice eligibility guidelines accurate enough to predict six-month survival in nursing home residents with dementia when compared with the Advanced Dementia Prognostic Tool (ADEPT)?
Background: Inaccurately estimating life expectancy in the almost 5 million nursing home residents with dementia prevents enrollment in palliative care and hospice for those who would benefit most. Creating and validating a mortality risk score would allow increased access to these services for such residents.
Study design: Prospective cohort study.
Setting: Twenty-one nursing homes in Boston.
Synopsis: A total of 606 nursing home residents with advanced dementia were recruited for this study. Each resident was assessed for Medicare hospice eligibility and assigned an ADEPT score. Mortality rate was determined six months later. These two assessment tools were compared regarding their ability to predict six-month mortality.
The mean ADEPT score was 10.1 (range of 1.0-32.5), with a higher score meaning worse prognosis. Sixty-five residents (10.7%) met Medicare hospice eligibility guidelines. A total of 111 residents (18.3%) died.
The ADEPT score was more sensitive (90% vs. 20%) but less specific (28.3% vs. 89%) than Medicare guidelines. The area under the receiver operating characteristic (AUROC) curve was 0.67 (95% CI, 0.62-0.72) for ADEPT and 0.55 (95% CI, 0.51-0.59) for Medicare.
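As an approximate, back-of-envelope illustration using the reported cohort (606 residents, of whom 111 died; exact cell counts were not reported in this summary), ADEPT’s higher sensitivity and lower specificity imply:
\[ \text{decedents flagged} \approx 0.90 \times 111 \approx 100, \qquad \text{survivors also flagged} \approx (1 - 0.283) \times 495 \approx 355 \]
In other words, ADEPT captures most residents who die within six months but flags many who do not, which is consistent with its modest AUROC.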
ADEPT was slightly better than hospice guidelines in predicting six-month mortality.
This study was limited in that resident data were collected at a single, randomly chosen time point, which might not reflect actual practice; referrals to palliative care and hospice usually are prompted by a decline in clinical status.
Bottom line: The ADEPT score might better estimate the six-month mortality in nursing home residents with dementia, which can help expand the enrollment of palliative care and hospice for these residents.
Citation: Mitchell SL, Miller SC, Teno JM, Kiely DK, Davis RB, Shaffer ML. Prediction of 6-month survival of nursing home residents with advanced dementia using ADEPT vs hospice eligibility guidelines. JAMA. 2010;304(17):1929-1935.
Residents Concerned about How New ACGME Duty-Hour Restrictions Will Impact Patient Care and Education
Clinical question: How do residents believe the forthcoming revised ACGME Rules for Supervision and Duty Hours will impact their residency?
Background: On July 1, revised ACGME duty-hour rules go into effect, limiting PGY-1 residents to 16-hour duty periods and PGY-2 and above to 28 hours. The effect these recommendations will have on patient care and resident education is unknown.
Study design: Twenty-question electronic, anonymous survey.
Setting: Twenty-three medical centers in the U.S., including residents from all disciplines and years in training.
Synopsis: Twenty-two percent of residents responded to the survey (n=2,521). Overall, 48% of residents disagreed with this statement: “Overall the changes will have a positive effect on education,” while only 26% agreed. Approximately half of those surveyed agreed that the revisions would improve their quality of life, but the same percentage also believed the revisions would increase the length of their residencies.
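The 22% response rate and 2,521 respondents imply an overall survey frame of roughly 11,500 residents (an approximate figure, since the response rate is rounded):
\[ n_{\text{surveyed}} \approx \frac{2{,}521}{0.22} \approx 11{,}500 \]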
Residents generally disagreed that the proposed changes would improve patient safety and the quality of care delivered, promote education over service obligations, or prepare them to assume senior roles. In free-text comments, residents expressed concerns about an increased number of handoffs and decreased continuity of care.
While the sample size is large and diverse, results of this survey can be affected by voluntary response bias and, therefore, could be skewed toward more extreme responses (in this case, more negative responses). The wide distribution of the responses suggests this might not be the case.
Bottom line: Residents do not believe the new requirements—though they could improve their quality of life—will positively impact patient care and education.
Citation: Drolet BC, Spalluto LB, Fischer SA. Residents’ perspectives on ACGME regulation of supervision and duty hours—a national survey. N Engl J Med. 2010;363(23):e34.
Decision Rule Might Help Clinicians Decide When to Order Renal Ultrasound to Evaluate Hospitalized Patients with Acute Kidney Injury
Clinical question: Can a clinical prediction rule aid clinicians in deciding when to order a renal ultrasound (RUS) in hospitalized patients with acute kidney injury?
Background: RUS routinely is obtained in patients admitted with acute kidney injury (AKI) to rule out obstruction as a cause. It is not known whether this test adds information to the routine evaluation of AKI or whether obtaining it is cost-effective.
Study design: Cross-sectional study.
Setting: Yale-New Haven Hospital in Connecticut.
Synopsis: This study evaluated 997 inpatients with AKI who underwent RUS. Outcome events were RUS identification of hydronephrosis (HN) or hydronephrosis requiring intervention (HNRI). Patients were divided into two samples: 200 in the derivation sample and 797 in the validation sample. The derivation sample was used to identify factors associated with HN; seven clinical variables were identified and used to create three risk groups: low, medium, and high.
In the validation sample, 10.6% of patients had HN and 3.3% had HNRI. The negative predictive value for HN was 96.9%, sensitivity 91.8%, and negative likelihood ratio 0.27. Among low-risk patients, the number needed to screen (NNS) was 32 to detect one case of HN and 223 to detect one case of HNRI. Based on these findings, clinicians might be able to delay or avoid ordering RUS in patients classified as low-risk.
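To put the negative likelihood ratio in context, a back-of-envelope post-test calculation using the reported 10.6% prevalence of HN (illustrative only, not taken from the paper) is consistent with the reported negative predictive value:
\[ \text{pre-test odds} = \frac{0.106}{1 - 0.106} \approx 0.119, \qquad \text{post-test odds} = 0.119 \times 0.27 \approx 0.032, \qquad \text{post-test probability} \approx \frac{0.032}{1.032} \approx 3\% \]
That is, a low-risk classification lowers the probability of hydronephrosis from roughly 11% to about 3%, in keeping with the 96.9% negative predictive value.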
The major limitation of this study was that it was based at a single institution. This study only evaluated RUS obtained in patients who were hospitalized and might not be applicable to outpatients.
Bottom line: RUS was not found to change clinical management in patients with AKI who were classified as low-risk for HN. Limiting RUS to patients at high risk for obstruction would increase the likelihood of finding clinically useful information that changes management and would limit the cost of unnecessary testing.
Citation: Licurse A, Kim MC, Dziura J, et al. Renal ultrasonography in the evaluation of acute kidney injury: developing a risk stratification framework. Arch Intern Med. 2010;170(21):1900-1907.
Romiplostim Has Higher Rate of Platelet Response and Fewer Adverse Events in Patients with Immune Thrombocytopenia
Clinical question: Does the use of romiplostim lead to increased platelet counts and lower rates of splenectomy and other adverse events when compared with standard therapy in patients with immune thrombocytopenia?
Background: Romiplostim is a thrombopoietin mimetic used to increase platelet counts in immune thrombocytopenia. Initial treatments for this disease involve glucocorticoids or intravenous immune globulin. Most patients require second-line medical or surgical therapies, including splenectomy.
Study design: Randomized, open-label controlled trial.
Setting: Eighty-five medical centers in North America, Europe, and Australia.
Synopsis: A total of 234 patients were randomized in a 2:1 ratio to receive either romiplostim or the medical standard of care. Co-primary endpoints were the incidence of treatment failure and the incidence of splenectomy; secondary endpoints included time to splenectomy, platelet count, platelet response, and quality of life. Treatment failure was defined as a platelet count of 20×10⁹ per liter or lower for four weeks, or a major bleeding event.
At the end of 52 weeks, patients receiving romiplostim had higher platelet counts, fewer bleeding events, less need for splenectomy (9% vs. 36%), and a better quality of life.
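Using the reported splenectomy rates, the 27-percentage-point absolute difference corresponds to roughly four patients treated with romiplostim over the 52-week study to avoid one splenectomy (an illustrative calculation, not one reported by the investigators):
\[ \mathrm{ARR} = 36\% - 9\% = 27\%, \qquad \mathrm{NNT} = \frac{1}{0.27} \approx 3.7 \]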
The short-term use of romiplostim in this study was not associated with an increase in adverse events when compared with standard therapy. However, maintenance of the elevated platelet count, which results from romiplostim treatment, requires continuous use of the drug; the long-term effects are unknown.
Bottom line: In patients with immune thrombocytopenia, romiplostim leads to increased platelet counts, decreased bleeding events, and decreased need for splenectomy compared to standard of care. However, the cost of the medication, when compared with current therapies, could be prohibitive.
Citation: Kuter DJ, Rummel M, Boccia R, et al. Romiplostim or standard of care in patients with immune thrombocytopenia. N Engl J Med. 2010;363(20):1889-1899. TH