Visualization and Reduction of a Meniscal Capsular Junction Tear in the Knee: An Arthroscopic Surgical Technique
The annual incidence of anterior cruciate ligament (ACL) injury in the general US population is estimated at 1 in 3000, or approximately 100,000 ACL injuries per year.1 The incidence of meniscal injuries after ACL tears ranges from 34% to 92%,2 with peripheral posterior horn tears of the medial meniscus accounting for 40% of the meniscal pathology.3
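As a rough arithmetic check (not from the cited source), a 1-in-3000 annual incidence yields approximately 100,000 injuries per year only under an assumed US population of roughly 300 million; the short Python snippet below simply makes that assumption explicit.

```python
# Rough check of the incidence figures above.
# The ~300 million US population is our assumption, not stated in the source.
us_population = 300_000_000
annual_acl_injuries = us_population / 3000   # 1 injury per 3000 people per year
print(annual_acl_injuries)                   # 100000.0
```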
Although several meniscal tear patterns and their treatments have been described in the literature, posterior medial meniscal capsular junction (MCJ) tears have not been adequately addressed. Thijn4 found the accuracy of routine anterior portal arthroscopy in identifying medial meniscus tears was only 81%. Gillies and Seligson5 found a 25% arthroscopic false-negative rate caused by failure to detect peripheral tears in the posterior horn of the medial meniscus.
We reviewed 781 patients (517 male, 264 female) who underwent ACL reconstruction at our clinic and found a 12.3% incidence of MCJ tears with primary ACL injury and a 23.6% incidence with revision ACL reconstruction. We believe this is a distinct injury pattern that can be missed if it is not specifically looked for during arthroscopy. Whether this tear pattern behaves differently from a posterior medial meniscus tear remains to be determined.
To address such tear patterns, with or without ACL reconstruction, we use an arthroscopic repair technique that provides direct visualization of the tear and its reduction.
Materials and Methods
The standard anterior medial and lateral arthroscopic portals are established. A 30° scope is placed in the anterior lateral portal, and an arthroscopic shaver is used to débride the ACL remnants, including the footprint and the femoral insertion site. The camera is then adjusted to look straight down. Next, it is placed between the posterior cruciate ligament (PCL) and the medial femoral condyle and advanced toward the posterior capsule. It is then adjusted to view medially (Figure 1). If there is a tear (Figures 2A, 2B), a posterior medial portal (described by Gillquist and colleagues6) is established using an 18-gauge spinal needle for localization followed by a small stab incision through the skin. The spinal needle is left in position to obtain the correct angle for the suture passer (Figure 3). A 70° Hewson suture passer (Smith & Nephew, Memphis, Tennessee) is passed through the posterior medial portal.
Once inside the joint, the suture passer is passed through the capsule and then through the posterior horn of the meniscus (Figure 4). A loop grasper is used to grab the suture on the end of the passer, which is then brought out through the posterior medial portal and loaded with a No. 2 MaxBraid suture (Biomet, Warsaw, Indiana) (Figure 5). In some cases, the suture passer’s wire exits the notch toward the anterior aspect of the knee. If this occurs, the loop grasper can be used to grab the wire from the anterior medial portal and load it with the MaxBraid suture.
Standard arthroscopic knot-tying techniques are used under direct visualization showing the reduction of the capsule to the meniscus (Figure 6). This is done from the posterior medial portal. The excess suture is cut with an arthroscopic suture cutter in the standard fashion. In the rare case of an intact ACL with this same tear pattern, the same technique can be used. If there is difficulty moving past the intact ACL and PCL, a posterior lateral portal can be used as another accessory portal. The arthroscope can then be placed in the posterior lateral portal, while the posterior medial portal can be used as the working portal. Care must be taken in either technique to avoid soft-tissue bridges.
Discussion
Previous biomechanical studies have shown the meniscus to be important to knee stability. In an ACL-deficient knee, the posterior medial meniscus acts as an important secondary stabilizer, so it is crucial to identify and repair tears there to avoid placing extra force on the ACL graft.7,8 We think an MCJ tear can potentially compromise knee stability as well, so the posterior aspect of the knee should be examined during every knee arthroscopy. However, biomechanical studies must be performed to validate this theory.
To assess whether orthopedists in general are aware of and concerned about MCJ tears, a survey was e-mailed to members of the Arthroscopy Association of North America (AANA) and the American Sports Medicine Fellowship Society (ASMFS). Sixty-seven orthopedic surgeons who perform ACL reconstruction surgeries responded to some or all of the following questions. Nearly half (48%) of the surgeons said they always assess the posteromedial MCJ by placing the camera between the PCL and the medial femoral condyle. Only 25% said MCJ tears should be repaired always, but another 64% said these tears should be repaired sometimes. Thus, 89% responded that at least some MCJ tears should be repaired. Most (88%) said these tears could sometimes or always be a source of chronic pain. Also, 92% said these tears could sometimes or always change the contact pressures in the knee, and 66% said these tears could sometimes or always change the rotational stability of the knee. Finally, 60% said MCJ tears could sometimes or always affect ACL graft failure. These data show a need to determine an appropriate surgical technique that will help treat MCJ tears.
There is a vast amount of literature about the meniscus, but there are few current studies on the specific entity of MCJ tears. We think these tears act similarly to posterior meniscus tears and should be addressed similarly. MCJ tears are easily missed on anterior arthroscopy. In every knee arthroscopy, the posterior aspect of the knee should be checked for these injuries, particularly in ACL-deficient knees. A lesion found within the capsule can be repaired with the technique we have described.
1. Fu FH, Cohen SB. Current Concepts in ACL Reconstruction. Thorofare, NJ: Slack; 2008.
2. Simonian PT, Cole BJ, Bach BR. Sports Injuries of the Knee: Surgical Approaches. New York, NY: Thieme; 2006.
3. Smith JP 3rd, Barrett GR. Medial and lateral meniscal tear patterns in anterior cruciate ligament-deficient knees. A prospective analysis of 575 tears. Am J Sports Med. 2001;29(4):415-419.
4. Thijn CJ. Accuracy of double-contrast arthrography and arthroscopy of the knee joint. Skeletal Radiol. 1982;8(3):187-192.
5. Gillies H, Seligson D. Precision in the diagnosis of meniscal lesions: a comparison of clinical evaluation, arthrography, and arthroscopy. J Bone Joint Surg Am. 1979;61(3):343-346.
6. Gillquist J, Hagberg G, Oretorp N. Arthroscopic examination of the posteromedial compartment of the knee joint. Int Orthop. 1979;3(1):13-18.
7. Levy IM, Torzilli PA, Warren RF. The effect of medial meniscectomy on anterior-posterior motion of the knee. J Bone Joint Surg Am. 1982;64(6):883-888.
8. Allen CR, Wong EK, Livesay GA, Sakane M, Fu FH, Woo SL. Importance of the medial meniscus in the anterior cruciate ligament–deficient knee. J Orthop Res. 2000;18(1):109-115.
Timing of lifestyle interventions for obesity
Obesity has become so pervasive that it is now considered a major health concern during pregnancy. Almost 56% of women aged 20-39 years in the United States are overweight or obese, based on the World Health Organization’s criteria for body mass index (BMI) and data from the 2009-2010 National Health and Nutrition Examination Survey (NHANES). Moreover, 7.5% of women in this age group are morbidly obese, with a BMI greater than 40 kg/m² (JAMA 2012;307:491-7).
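For readers who want the formula behind these categories: BMI is weight in kilograms divided by the square of height in meters, and the cutoffs referenced here are the standard WHO values. The Python sketch below is purely illustrative, and the example height and weight are hypothetical.

```python
# Minimal illustration of the WHO BMI categories referred to above.
# BMI = weight (kg) / height (m)^2; cutoffs are the standard WHO values.

def bmi(weight_kg, height_m):
    return weight_kg / height_m ** 2

def who_category(bmi_value):
    if bmi_value >= 40:
        return "class III (morbid) obesity"
    if bmi_value >= 30:
        return "obese"
    if bmi_value >= 25:
        return "overweight"
    if bmi_value >= 18.5:
        return "normal weight"
    return "underweight"

# Hypothetical example: a 1.65 m woman weighing 110 kg
value = bmi(110, 1.65)                        # ~40.4 kg/m^2
print(round(value, 1), who_category(value))   # 40.4 class III (morbid) obesity
```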
Obesity in pregnancy not only increases the risk of spontaneous abortions and congenital anomalies, it also increases the risk of gestational diabetes (GDM), hypertensive disorders, and other metabolic complications that affect both the mother and fetus.
Of much concern is the increased risk of fetal overgrowth and long-term health consequences for children of obese mothers. Obesity in early pregnancy has been shown to more than double the risk of obesity in the offspring, which in turn puts these children at risk for developing the metabolic syndrome – and, as Dr. Thomas Moore pointed out in September’s Master Class – appears to program these offspring for downstream cardiovascular risk in adulthood.
Mean term birth weights have risen in the United States during the past several decades. In Cleveland, we have seen a significant 116 g increase in mean term birth weight since 1975; this increase encompasses weights from the 5th to the 95th percentiles. Even more concerning is our finding that the ponderal index in our neonatal population has increased because of decreased fetal length over the last decade.
Some recent studies have suggested that the increase in birth weight in the United States has reached a plateau, but our analyses of national trends suggest that such change is secondary to factors such as earlier gestational age of delivery. Concurrently, an alarming number of children and adolescents – 17% of those aged 2-19 years, according to the 2009-2010 NHANES data – are overweight or obese (JAMA 2012;307:483-90).
How to best treat obesity for improved maternal and fetal health has thus become a focus of research. Studies on lifestyle interventions for obese women during pregnancy have aimed to prevent excessive gestational weight gain and decrease adverse perinatal outcomes – mainly macrosomia, GDM, and hypertensive disorders.
However, the results of this recent body of research have been disappointing. Lifestyle interventions initiated during pregnancy have had only limited success in improving perinatal outcomes. The research tells us that while we may be able to reduce excessive gestational weight gain, it is unlikely that we will be successful in reducing fetal overgrowth, GDM, or preeclampsia in obese women.
Moreover, other studies show that it is a high pregravid BMI – not excessive gestational weight gain or the development of GDM – that plays the biggest role in fetal overgrowth and fetal adiposity.
A paradigm shift is in order. We must think about lifestyle intervention and weight loss before pregnancy, when the woman’s metabolic condition can be improved in time to minimize adverse perinatal metabolic outcomes and to maximize metabolic benefits relating to fetal body composition and metabolism.
Role of prepregnancy BMI
In 2008, the Institute of Medicine (IOM) and National Research Council reexamined the 1990 guidelines for gestational weight gain. They concluded that excessive weight gain in pregnancy was a primary contributor to the development of obesity in women. In fact, according to the 2009 IOM report, “Weight Gain During Pregnancy: Reexamining the Guidelines” (Washington: National Academy Press, 2009), 38% of normal-weight, 63% of overweight, and 46% of obese women had gained weight in excess of the earlier guidelines.
Helping our patients to gain within the guidelines is important. Excessive gestational weight gain is a primary risk factor for maternal postpartum weight retention, which increases the risk for maternal obesity in a subsequent pregnancy. It also has been associated with a modest increased risk of preterm birth and development of type 2 diabetes.
Interestingly, however, high gestational weight gain has not been related to an increased risk of fetal overgrowth or macrosomia in many obese women. Increased gestational weight gain is a greater risk for fetal overgrowth in women who are of normal weight prior to pregnancy (J. Clin. Endocrinol. Metab. 2012;97:3648-54).
Our research has found that in overweight and obese women, it is maternal pregravid BMI – and not gestational weight gain – that presents the greatest risk for fetal macrosomia, and more specifically, the greatest risk for fetal obesity. Even when glucose tolerance levels are normal, overweight and obese women have neonates who are heavier and who have significant increases in the percentage of body fat and fat mass (Am. J. Obstet. Gynecol. 2006;195:1100-3).
In an 8-year prospective study of the perinatal risk factors associated with childhood obesity, we similarly found that maternal pregravid BMI – independent of maternal glucose status or gestational weight gain – was the strongest predictor of childhood obesity and metabolic dysfunction (Am. J. Clin. Nutr. 2009;90:1303-13).
Other studies have teased apart the roles of maternal obesity and GDM in long-term health of offspring. This work has found that maternal obesity during pregnancy is associated with metabolic syndrome in the offspring and an increased risk of type 2 diabetes in youth, independent of maternal diabetes during pregnancy. A recent meta-analysis also reported that, although maternal diabetes is associated with an increased BMI z score, this was no longer significant after adjustments were made for prepregnancy BMI (Diabetologia 2011;54:1957-66).
Maternal pregravid obesity, therefore, is not only a risk factor for neonatal adiposity at birth, but also for the longer-term risk of obesity and metabolic dysfunction in the offspring – independent of maternal GDM or excessive gestational weight gain.
Interventions in Pregnancy
Numerous prospective trials have examined lifestyle interventions for obese women during pregnancy. One randomized controlled study of a low glycemic index diet in pregnancy (known as the ROLO study) involved 800 women in Ireland who had previously delivered an infant weighing more than 4,000 g. Women were randomized at 13 weeks to receive the low glycemic index diet or no intervention. Despite a decrease in gestational weight gain in the intervention group, there were no differences in birth weight, birth weight percentile, ponderal index, or macrosomia between the two groups (BMJ 2012;345:e5605).
Another randomized controlled trial reported by a Danish group involved an intervention that consisted of dietary guidance, free membership in a fitness center, and personal coaching initiated between 10 and 14 weeks of gestation. There was a decrease in gestational weight gain in the intervention group, but paradoxically, the infants in the intervention group also had significantly higher birth weight, compared with controls (Diabetes Care 2011;34:2502-7).
Additionally, at least five meta-analyses published in the past 2 years have examined lifestyle interventions during pregnancy. All have concluded that interventions initiated during pregnancy achieve only a limited reduction in gestational weight gain, and not necessarily to within the IOM guidelines. The literature contains scant evidence of further benefits for infant or maternal health, that is, reductions in fetal overgrowth, GDM, or hypertensive disorders.
A recent Cochrane review also concluded that the results of several randomized controlled trials suggest no significant difference in GDM incidence between women receiving an exercise intervention and those receiving routine care.
Just this year, three additional randomized controlled trials of lifestyle interventions during pregnancy were published. Only one, the Treatment of Obese Pregnant Women (TOP) study, showed a modest effect in decreasing gestational weight gain. None found a reduction in GDM or fetal overgrowth.
Focus on prepregnancy
Obesity is an inflammatory condition that increases the risk of insulin resistance, impaired beta-cell function, and abnormal adiponectin concentrations. In pregnancy, maternal obesity and hyperinsulinemia can affect placental growth and gene expression.
We have studied lean and obese women recruited prior to a planned pregnancy, as well as lean and obese women scheduled for elective pregnancy termination in the first trimester. Our research, some of which we reported recently in the American Journal of Physiology, has shown increased expression of lipogenic and inflammatory genes in maternal adipose tissue and in the placenta of obese women in the early first trimester, before any phenotypic change becomes apparent (Am. J. Physiol. Endocrinol. Metab. 2012;303:e832-40).
Specifically, hyperinsulinemia and/or defective insulin action in obese women appears to affect the placental programming of genes relating to adipokine expression and lipid metabolism, as well as mitochondrial function. Altered inflammatory and lipid pathways affect the availability of nutrients for the fetus and, consequently, the size and body composition of the fetus. Fetal overgrowth and neonatal adiposity can result.
In addition, our research has shown that obese women have decreased insulin suppression of lipolysis in white adipose tissue, which during pregnancy results in increased lipid availability for fetal fat accretion and lipotoxicity.
When interventions aimed at weight loss and improved insulin sensitivity are undertaken before pregnancy or in the period between pregnancies, we have the opportunity to increase fat oxidation and reduce oxidative stress in early pregnancy. We also may be able to limit placental inflammation and favorably affect placental growth and gene expression. By the second trimester, our research suggests, gene expression in the placenta and early molecular changes in the white adipose tissue have already been programmed and cannot be reversed (Am. J. Physiol. Endocrinol. Metab. 2012;303:e832-40).
In studies by our group and others of interpregnancy weight loss or gain, interpregnancy weight loss has been associated with a lower risk of large-for-gestational-age (LGA) infants, whereas interpregnancy weight gain has been associated with an increased risk of LGA. Preliminary work from our group shows that the decrease in birth weight involves primarily fat and not lean mass.
The 2009 IOM guidelines support weight loss before pregnancy and state that overweight women should receive individual preconceptional counseling to improve diet quality, increase physical activity, and normalize weight. Multifaceted interventions do work: in obese nonpregnant individuals, lifestyle interventions that include an exercise program, diet, and behavioral modification have been shown to improve insulin sensitivity, reduce inflammation, and improve overall metabolic function.
According to the IOM report, preconceptional services aimed at achieving a healthy weight before conceiving will represent “a radical change to the care provided to obese women of childbearing age.” With continuing research and accumulating data, however, the concept is gaining traction as a viable paradigm for improving perinatal outcomes, with long-term benefits for both the mother and her baby.
Dr. Catalano reports that he has no disclosures relevant to this Master Class.
Newer blood linked to fewer complications from heart surgery
Photo credit: University of Ottawa Heart Institute
VANCOUVER—In a large study, heart surgery patients who received recently donated blood had significantly fewer post-operative complications than those who received blood stored for more than 2 weeks.
Patients who received newer blood had a lower rate of mortality, infection, and renal failure.
They were also less likely to require prolonged ventilation or re-exploration for bleeding.
Ansar Hassan, MD, PhD, of Saint John Regional Hospital in New Brunswick, Canada, and his colleagues presented these results at the Canadian Cardiovascular Congress as abstract 562.
The researchers examined records at the New Brunswick Heart Centre in Saint John for non-emergency heart surgeries performed from January 2005 to September 2013 on patients who received red blood cells during or after surgery and who stayed in the hospital less than 30 days.
Of the 2015 patients, slightly more than half (n=1052) received only blood that had been donated within 14 days of transfusion. The remaining patients received blood that had been donated, in part or entirely, more than 14 days before transfusion. Canadian protocols allow blood to be stored and used up to 6 weeks after donation.
Patients who received newer blood were more likely to be female, to have unstable angina, to have undergone isolated coronary artery bypass grafting or valve surgery, to have had shorter bypass and cross-clamp times, and to have left the operating room on inotropes.
After surgery, patients who received newer blood had lower rates of mortality (1.7% vs 3.3%, P=0.02), infection (3.2% vs 5.4%, P=0.02), and renal failure (12.8% vs 17.7%, P=0.0003), along with a nonsignificant reduction in atrial fibrillation (43.8% vs 47.3%, P=0.12).
In addition, they were less likely to require ventilation for more than 24 hours (3% vs 7.7%, P<0.0001) or re-exploration for bleeding (1.5% vs 3.1%, P=0.02).
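For readers who want to sanity-check these unadjusted comparisons, the sketch below reproduces the mortality contrast with a chi-square test. The event counts are back-calculated from the reported percentages and the assumed group sizes (1,052 newer-blood and 963 older-blood recipients), so this is an approximate reconstruction rather than the study's actual data.

```python
# Approximate check of the unadjusted mortality comparison; event counts are
# back-calculated from the reported percentages (hypothetical reconstruction).
from scipy.stats import chi2_contingency

newer_n, older_n = 1052, 2015 - 1052            # assumed group sizes
newer_deaths = round(0.017 * newer_n)           # ~18 deaths (1.7%)
older_deaths = round(0.033 * older_n)           # ~32 deaths (3.3%)

table = [
    [newer_deaths, newer_n - newer_deaths],     # newer blood: died, survived
    [older_deaths, older_n - older_deaths],     # older blood: died, survived
]
# Without the Yates continuity correction, p comes out near the reported 0.02.
chi2, p, dof, expected = chi2_contingency(table, correction=False)
print(f"chi-square = {chi2:.2f}, p = {p:.3f}")
```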
After the researchers adjusted for differences in baseline and intra-operative characteristics, receiving newer blood was associated with a significant reduction in a composite of the aforementioned outcomes (odds ratio=0.79, P=0.01).
“The findings show that we need to pay attention to the age of the blood we give cardiac surgery patients,” Dr Hassan said. “Perhaps more importantly, we need new studies to determine what is driving this relationship between the age of blood and the outcomes we are seeing.”
Dr Hassan noted that previous studies have reached contradictory conclusions on this subject, which was a reason this study was conducted.
Obese ALL patients more likely to have MRD after induction
Obese youths with acute lymphoblastic leukemia (ALL) are known to have worse outcomes than their lean counterparts.
To gain more insight into this phenomenon, investigators set out to determine if body mass index (BMI) impacted ALL patients’ responses to initial chemotherapy.
The results showed that, following induction chemotherapy, obese patients were more than twice as likely as non-obese patients to have minimal residual disease (MRD).
“Induction chemotherapy provides a patient’s best chance for remission or a cure,” said principal investigator Steven Mittelman, MD, PhD, of The Saban Research Institute of Children’s Hospital Los Angeles in California.
“Our findings indicate that a patient’s obesity negatively impacts the ability of chemotherapy to kill leukemia cells, reducing the odds of survival.”
The study, which was published in Blood, included 198 patients, ages 1 to 21 years, who were diagnosed with ALL.
Each patient’s BMI was converted to an age- and sex-specific percentile and classified according to the Centers for Disease Control and Prevention’s thresholds for overweight (85th to 94th percentile) and obese (95th percentile or higher). Patients below the 85th percentile were considered “lean.”
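As a minimal sketch of the classification rule described above, the function below assumes the BMI percentile has already been computed from the CDC's age- and sex-specific growth charts (the growth-chart lookup itself is not shown).

```python
def weight_status(bmi_percentile: float) -> str:
    """Classify a pediatric BMI percentile using the CDC-style cutoffs
    described in the study; the percentile is assumed to be pre-computed."""
    if bmi_percentile >= 95:
        return "obese"
    if bmi_percentile >= 85:
        return "overweight"
    return "lean"

# Example: a child at the 90th percentile falls in the overweight band.
print(weight_status(90))  # overweight
```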
About one-third of the patients were obese or overweight at the time of diagnosis.
MRD was determined by testing bone marrow specimens at the end of induction therapy, and patients were followed for 2 to 5 years from the time of diagnosis.
The investigators found that lean patients with MRD had similar outcomes to obese patients without evidence of MRD. Obese patients with MRD had the worst outcomes.
Additionally, although nearly a quarter of the patients initially deemed “lean” gained weight and became obese during the first month of treatment, these patients still showed similar outcomes to those who remained lean.
“In addition to increasing a patient’s likelihood of having persistent disease following treatment, obesity appears to add a risk factor that changes the interaction between chemotherapy and residual leukemia cells,” said Hisham Abdel-Azim, MD, also of The Saban Research Institute.
Findings from this study offer new avenues for investigation that include modifying chemotherapy regimens for obese patients and working to change a patient’s weight status beginning at the time of diagnosis.
Armored CAR T cells next on the production line
NEW YORK—Chimeric antigen receptor (CAR) T cells have “remarkable” activity, according to a speaker at the NCCN 9th Annual Congress: Hematologic Malignancies.
“[T]his chimera binds like an antibody, but it acts like a T cell, so it combines the best of both worlds,” said Jae H. Park, MD, of Memorial Sloan Kettering Cancer Center (MSKCC) in New York.
He then traced the evolution of CAR T-cell design, discussed clinical trials of CD19-targeted T cells, and described how investigators are working to build a better T cell.
Researchers found that T-cell activation and proliferation require signaling through a costimulatory receptor, such as CD28, 4-1BB, or OX-40. Without costimulation, the T cell becomes unresponsive or undergoes apoptosis.
So based on this observation, Dr Park said, several research groups created second- and third-generation CARs to incorporate the costimulatory signal.
The first generation was typically fused to the CD8 domain. Second-generation CARs include a costimulatory signaling domain, such as CD28, 4-1BB, or OX40, and third-generation CARs contain signaling domains from 2 costimulatory receptors, such as CD28 with 4-1BB or CD28 with OX40.
The built-in costimulatory signal proved superior to the first-generation CAR T cells.
In NOD/SCID mice inoculated with NALM-6 lymphoma cells, Dr Park said, about 50% more mice were “cured,” in terms of survival, when CD19-targeted T cells were combined with a CD80 costimulatory ligand than when the ligand was absent.
Clinical trials
Clinical trials at MSKCC using second-generation CD19-targeted T cells in relapsed B-cell acute lymphoblastic leukemia (ALL) produced an overall complete response (CR) rate of 88%, with responses achieved in a median of 22.1 days, and 72% of the CRs were minimal residual disease (MRD) negative.
So the CAR T cells produce a “very rapid and deep remission,” Dr Park said.
CAR T-cell therapy, however, comes with adverse events, most notably, cytokine release syndrome (CRS), which results from T-cell activation. CRS causes fevers, hypotension, and neurologic toxicities including mental status changes, obtundation, and seizures.
“CRS is not unique to CAR T-cell therapy,” Dr Park said. “Any therapy that activates T cells can have this type of side effect.”
Dr Park noted that CRS is associated with disease burden at the time of treatment. “The larger the disease burden pre T-cell therapy,” he said, “the more likely [patients are] to develop CRS.”
In the MSKCC trial, no patient with very low disease burden (less than 5% blasts in the bone marrow) developed CRS.
However, there is also a correlation between tumor burden and T-cell expansion, he added. T cells expand much better with a larger disease burden, because there is a greater antigen load.
The investigators found that serum C-reactive protein can serve as a surrogate marker for the severity of CRS. Patients with levels above 20 mg/dL are more likely to experience CRS.
And Dr Park pointed out that CRS symptoms respond pretty rapidly to steroids or interleukin-6 receptor blockade.
CAR T-cell therapy has also been used to treat chronic lymphocytic leukemia, but with much more modest response rates than in ALL. Both University of Pennsylvania and MSKCC trials in CLL have produced overall response rates around 40%.
Building a better T cell
Dr Park described efforts underway to develop fourth-generation, “armored” CAR T cells intended to overcome the hostile tumor microenvironment, which contains multiple inhibitory factors that suppress effector T cells.
Armored CAR T cells can secrete inflammatory cytokines that alter the tumor microenvironment and overcome these inhibitory effects.
Dr Park described a potential scenario: The armored CAR T cells secrete IL-12, enhance the central memory phenotype, enhance cytotoxicity, enhance persistence, modify the endogenous immune system and T-cell activation, and reactivate tumor-infiltrating lymphocytes.
He said future studies will focus on translation of these armored CAR T cells to the clinical setting in both hematologic and solid tumor malignancies.
Bacterium could help control malaria, dengue
Credit: CDC
A bacterium isolated from a mosquito’s gut could aid the fight against malaria and dengue, according to a study published in PLOS Pathogens.
In previous research, scientists isolated Csp_P, a member of the chromobacteria family, from the gut of Aedes aegypti mosquitoes.
Now, the team has found that Csp_P can directly inhibit malaria and dengue pathogens in vitro and shorten the life span of the mosquitoes that transmit both diseases.
George Dimopoulos, PhD, of Johns Hopkins University in Baltimore, Maryland, and his colleagues examined Csp_P’s actions on both mosquitoes and pathogens, and the results suggest that Csp_P might help to fight malaria and dengue at different levels.
The researchers added Csp_P to sugar water fed to mosquitoes and found that the bacterium quickly colonizes the gut of the two most important mosquito disease vectors, Aedes aegypti and Anopheles gambiae.
Moreover, the presence of Csp_P in the gut reduced the susceptibility of the respective mosquitoes to infection with the malaria parasite Plasmodium falciparum or with dengue virus.
Even without gut colonization, exposure to Csp_P through food or breeding water shortened the lifespan of adult mosquitoes and mosquito larvae of both species.
When the researchers tested whether Csp_P could act against the malaria or dengue pathogens directly, they found that the bacterium, likely through the production of toxic metabolites, can inhibit the growth of Plasmodium at various stages during the parasite’s life cycle and also abolish dengue virus infectivity.
The team said these toxic metabolites could potentially be developed into drugs to treat malaria and dengue.
Overall, the researchers concluded that Csp_P’s broad-spectrum antipathogen properties and ability to kill mosquitoes make it a good candidate for the development of novel control strategies for malaria and dengue, so it warrants further study.
Paracentesis in Cirrhosis Patients
Ascites is the most common complication of cirrhosis leading to hospital admission.[1] Approximately 12% of hospitalized patients who present with decompensated cirrhosis and ascites have spontaneous bacterial peritonitis (SBP); half of these patients do not present with abdominal pain, fever, nausea, or vomiting.[2] Guidelines published by the American Association for the Study of Liver Diseases (AASLD) recommend paracentesis for all hospitalized patients with cirrhosis and ascites and also recommend long‐term antibiotic prophylaxis for survivors of an SBP episode.[3] Despite evidence that in‐hospital mortality is reduced in those patients who receive paracentesis in a timely manner,[4, 5] only 40% to 60% of eligible patients receive paracentesis.[4, 6, 7] We aimed to describe clinical predictors of paracentesis and use of antibiotics following an episode of SBP in patients with decompensated cirrhosis and ascites.
METHODS
We conducted a retrospective cohort study of adults admitted to a single tertiary care center between January 1, 2009 and December 31, 2009.[7] We included patients with an International Classification of Diseases, Ninth Revision discharge code consistent with decompensated cirrhosis who met clinical criteria for decompensated cirrhosis (see
RESULTS
We identified 193 admissions for 103 patients with decompensated cirrhosis and ascites (Table 1). Of these, 41% (80/193) received diagnostic paracentesis. Mean/standard deviation for age was 53.6/12.4 years; 71% of patients were male and 63% were English speaking. Common comorbidities included diabetes mellitus (33%), psychiatric diagnosis (29%), substance abuse (18%), and renal failure (17%). Excluding SBP, 31% of patients had another documented infection. Gastroenterology was consulted in 50% of the admissions. Fever was present in 27% of patients, elevated white blood cell (WBC) count (ie, WBC >11 k/mm3) was present in 27% of patients, International Normalized Ratio (INR) was elevated (>1.1) in 92% of patients, and 16% of patients had a platelet count of <50,000/mm3. Patients who received paracentesis were less likely to have a fever on presentation (19% vs 32%, P=0.06), low (ie, <50,000/mm3) platelet count (11% vs 19%, P=0.14), or concurrent gastrointestinal (GI) bleed (6% vs 16%, P=0.05). In a multiple logistic regression model including characteristics associated at P<0.2 with paracentesis, fever, low platelet count, and concurrent GI bleeding were associated with decreased odds of receiving paracentesis (Appendix 1).
| Characteristic | Overall, N=193, Mean/SD or N (%)* | Paracentesis (−), n=113, Mean/SD or N (%) | Paracentesis (+), n=80, Mean/SD or N (%) | Odds Ratio (95% CI) |
| --- | --- | --- | --- | --- |
| Age, y | 53.6/12.4 | 54.1/13.4 | 53.2/11.7 | 1.00 (0.98–1.03) |
| Sex (male) | 137 (71.0%) | 78 (69.0%) | 59 (73.8%) | 1.26 (0.67–2.39) |
| English speaking | 122 (63.2%) | 69 (61.1%) | 53 (66.3%) | 1.25 (0.69–2.28) |
| Etiology | | | | |
| Alcohol | 120 (62.2%) | 74 (65.5%) | 46 (57.5%) | 0.71 (0.40–1.29) |
| Hepatitis C | 94 (48.7%) | 57 (50.4%) | 37 (46.3%) | 0.85 (0.48–1.50) |
| Hepatitis B | 16 (8.3%) | 7 (6.2%) | 9 (11.3%) | 1.92 (0.68–5.39) |
| NASH | 8 (4.2%) | 4 (3.5%) | 4 (5.0%) | 1.43 (0.35–5.91) |
| Cryptogenic | 11 (5.7%) | 6 (5.3%) | 5 (6.3%) | 1.19 (0.35–4.04) |
| Comorbidities | | | | |
| Substance abuse | 34 (17.6%) | 22 (19.5%) | 12 (15.0%) | 0.73 (0.34–1.58) |
| Psychiatric diagnosis | 55 (28.5%) | 38 (33.6%) | 17 (21.3%) | 0.53 (0.27–1.03) |
| Diabetes mellitus | 63 (32.6%) | 37 (32.7%) | 26 (32.5%) | 0.99 (0.54–1.82) |
| Renal failure | 33 (17.1%) | 20 (17.7%) | 13 (16.3%) | 0.90 (0.42–1.94) |
| GI bleed | 23 (11.9%) | 18 (15.9%) | 5 (6.3%) | 0.35 (0.12–0.99) |
| Admission MELD | 17.3/7.3 | 17.5/7.3 | 17.0/7.3 | 0.99 (0.95–1.03) |
| Creatinine, median/IQR | 0.9/0.7 | 0.9/0.7 | 0.9/0.8 | 1.02 (0.82–1.27) |
| Gastroenterology consult | 97 (50.3%) | 46 (40.7%) | 51 (63.8%) | 2.56 (1.42–4.63) |
| Infection (UTI, pneumonia, other) | 60 (31.1%) | 38 (33.6%) | 22 (27.5%) | 0.75 (0.40–1.40) |
| Temperature ≥100.4°F | 49 (26.8%) | 34 (32.4%) | 15 (19.2%) | 0.50 (0.25–1.00) |
| WBC >11 k/mm3 | 50 (27.3%) | 28 (26.7%) | 22 (28.2%) | 1.08 (0.56–2.08) |
| WBC <4 k/mm3 | 43 (23.5%) | 23 (21.9%) | 20 (25.6%) | 1.23 (0.62–2.44) |
| INR >1.1 | 149 (92.0%) | 83 (93.3%) | 66 (90.4%) | 0.68 (0.22–2.13) |
| Highest temperature, °F | 98.9/1.1 | 99.1/1.3 | 98.8/0.8 | 0.82 (0.62–1.09) |
| Highest HR | 98.2/20.4 | 97.4/22.4 | 99.2/17.4 | 1.00 (0.99–1.02) |
| Highest RR | 24.5/13.7 | 25.2/16.8 | 23.5/7.8 | 0.99 (0.96–1.02) |
| Lowest SBP | 101.0/20.0 | 99.4/20.3 | 102.2/19.7 | 0.99 (0.98–1.01) |
| Lowest MAP | 73.0/12.2 | 73.2/13.3 | 72.7/10.6 | 1.00 (0.97–1.02) |
| Lowest O2 sat | 92.6/13.6 | 91.0/17.7 | 94.9/2.8 | 1.04 (0.99–1.10) |
| Highest PT | 15.8/3.8 | 15.9/3.7 | 15.7/3.9 | 0.98 (0.90–1.08) |
| Platelets <50 k/mm3 | 30 (15.9%) | 21 (19.3%) | 9 (11.3%) | 0.53 (0.23–1.23) |
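As an illustration of how the unadjusted odds ratios in Table 1 are derived, the sketch below reproduces the GI bleed row from its counts (5 of 80 paracentesis recipients vs 18 of 113 non-recipients). The multivariable estimates in Appendix 1 would come from a logistic regression fit, which is not reproduced here; this is a worked check, not the authors' analysis code.

```python
import math

# GI bleed counts from Table 1: 5 of 80 paracentesis (+) patients,
# 18 of 113 paracentesis (-) patients.
a, b = 5, 80 - 5      # GI bleed yes / no among paracentesis (+)
c, d = 18, 113 - 18   # GI bleed yes / no among paracentesis (-)

odds_ratio = (a / b) / (c / d)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)       # SE of log odds ratio
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(f"OR = {odds_ratio:.2f} (95% CI {lo:.2f}-{hi:.2f})")
# -> OR = 0.35 (95% CI 0.12-0.99), matching the GI bleed row in Table 1
```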
Of the patients who received paracentesis (n=80), 14% were diagnosed with SBP. Of these, 55% received prophylaxis on discharge. Among the patients who did not receive paracentesis (n=113), 38 (34%) received antibiotics for another documented infection (eg, pneumonia), and 25 patients (22%) received antibiotics with no other documented infection or evidence of variceal bleeding. Of these 25 patients who were presumed to be empirically treated for SBP (Figure 1), only 20% were prescribed prophylactic antibiotics on discharge.

CONCLUSION
We found that many patients with decompensated cirrhosis and ascites did not receive paracentesis when hospitalized, which is similar to previously published data.[4, 6, 7] Clinical evidence of infection, such as fever or elevated WBC count, did not increase the odds of receiving paracentesis. Many patients treated for SBP were not discharged on prophylaxis.
This study is limited by its small single‐center design. We could only use data from 1 year (2009), because study data collection was part of a quality‐improvement project that took place for that year only. We did not adjust for the number of red blood cells in the ascitic fluid samples. We were also unable to determine the timing of gastroenterology consultation (whether it was done prior to paracentesis), admission venue (floor vs intensive care), or patient history of SBP.
Despite these limitations, there are important implications. First, the decision to perform paracentesis was not associated with symptoms of infection, although some clinical factors (eg, low platelets or GI bleeding) were associated with reduced odds of receiving paracentesis. Second, a majority of patients treated for SBP did not receive prophylactic antibiotics at discharge. These findings suggest a clear opportunity to increase awareness and acceptance of AASLD guidelines among hospital medicine practitioners. Quality‐improvement efforts should focus on the education of providers, and future research should identify barriers to paracentesis at both the practitioner and system levels (eg, availability of interventional radiology). Checklists or decision support within electronic order entry systems may also help reduce the low rates of paracentesis seen in our and prior studies.[4, 6, 7]
Disclosures: Dr. Lagu is supported by the National Heart, Lung, and Blood Institute of the National Institutes of Health under award number K01HL114745. Drs. Lagu, Ghaoui, and Brooling had full access to all of the data in the study. They take responsibility for the integrity of the data and the accuracy of the data analysis. Drs. Lagu, Ghaoui, and Brooling conceived of the study. Dr. Ghaoui acquired the data. Ms. Friderici carried out the statistical analyses. Drs. Lagu, Ghaoui, Brooling, Lindenauer, and Ms. Friderici analyzed and interpreted the data, drafted the manuscript, and critically reviewed the manuscript for important intellectual content. The authors report no conflicts of interest.
1. Spanish Collaborative Study Group On Therapeutic Management In Liver Disease. Multicenter hospital study on prescribing patterns for prophylaxis and treatment of complications of cirrhosis. Eur J Clin Pharmacol. 2002;58(6):435–440.
2. Bacterial infection in patients with advanced cirrhosis: a multicentre prospective study. Dig Liver Dis. 2001;33(1):41–48.
3. AASLD. Introduction to the revised American Association for the Study of Liver Diseases Practice Guideline management of adult patients with ascites due to cirrhosis 2012. Hepatology. 2013;57(4):1651–1653.
4. Paracentesis is associated with reduced mortality in patients hospitalized with cirrhosis and ascites. Clin Gastroenterol Hepatol. 2014;12(3):496–503.e1.
5. Delayed paracentesis is associated with increased in‐hospital mortality in patients with spontaneous bacterial peritonitis. Am J Gastroenterol. 2014;109(9):1436–1442.
6. The quality of care provided to patients with cirrhosis and ascites in the Department of Veterans Affairs. Gastroenterology. 2012;143(1):70–77.
7. Measurement of the quality of care of patients admitted with decompensated cirrhosis. Liver Int. 2014;34(2):204–210.
Ascites is the most common complication of cirrhosis leading to hospital admission.[1] Approximately 12% of hospitalized patients who present with decompensated cirrhosis and ascites have spontaneous bacterial peritonitis (SBP); half of these patients do not present with abdominal pain, fever, nausea, or vomiting.[2] Guidelines published by the American Association for the Study of Liver Diseases (AASLD) recommend paracentesis for all hospitalized patients with cirrhosis and ascites and also recommend long‐term antibiotic prophylaxis for survivors of an SBP episode.[3] Despite evidence that in‐hospital mortality is reduced in those patients who receive paracentesis in a timely manner,[4, 5] only 40% to 60% of eligible patients receive paracentesis.[4, 6, 7] We aimed to describe clinical predictors of paracentesis and use of antibiotics following an episode of SBP in patients with decompensated cirrhosis and ascites.
METHODS
We conducted a retrospective cohort study of adults admitted to a single tertiary care center between January 1, 2009 and December 31, 2009.7 We included patients with an International Classification of Diseases, Ninth Revision discharge code consistent with decompensated cirrhosis who met clinical criteria for decompensated cirrhosis (see
RESULTS
We identified 193 admissions for 103 patients with decompensated cirrhosis and ascites (Table 1). Of these, 41% (80/193) received diagnostic paracentesis. Mean/standard deviation for age was 53.6/12.4 years; 71% of patients were male and 63% were English speaking. Common comorbidities included diabetes mellitus (33%), psychiatric diagnosis (29%), substance abuse (18%), and renal failure (17%). Excluding SBP, 31% of patients had another documented infection. Gastroenterology was consulted in 50% of the admissions. Fever was present in 27% of patients, elevated white blood cell (WBC) count (ie, WBC >11 k/mm3) was present in 27% of patients, International Normalized Ratio (INR) was elevated (>1.1) in 92% of patients, and 16% of patients had a platelet count of <50,000/mm3. Patients who received paracentesis were less likely to have a fever on presentation (19% vs 32%, P=0.06), low (ie, <50,000/mm3) platelet count (11% vs 19%, P=0.14), or concurrent gastrointestinal (GI) bleed (6% vs 16%, P=0.05). In a multiple logistic regression model including characteristics associated at P0.2 with paracentesis, fever, low platelet count, and concurrent GI bleeding were associated with decreased odds of receiving paracentesis (Appendix 1).
Overall, N=193, Mean/SD or N (%)* | Paracentesis (), n=113, Mean/SD or N (%) | Paracentesis (+), n=80, Mean/SD or N (%) | Odds Ratio (95% CI) | |
---|---|---|---|---|
| ||||
Age, y | 53.6/12.4 | 54.1/13.4 | 53.2/11.7 | 1.00 (0.981.03) |
Sex (male) | 137 (71.0%) | 78 (69.0%) | 59 (73.8%) | 1.26 (0.672.39) |
English speaking | 122 (63.2%) | 69 (61.1%) | 53 (66.3%) | 1.25 (0.692.28) |
Etiology | ||||
Alcohol | 120 (62.2%) | 74 (65.5%) | 46 (57.5%) | 0.71 (0.401.29) |
Hepatitis C | 94 (48.7%) | 57 (50.4%) | 37 (46.3%) | 0.85 (0.481.50) |
Hepatitis B | 16 (8.3%) | 7 (6.2%) | 9 (11.3%) | 1.92 (0.685.39) |
NASH | 8 (4.2%) | 4 (3.5%) | 4 (5.0%) | 1.43 (0.355.91) |
Cryptogenic | 11 (5.7%) | 6 (5.3%) | 5 (6.3%) | 1.19 (0.354.04) |
Comorbidities | ||||
Substance abuse | 34 (17.6%) | 22 (19.5%) | 12 (15.0%) | 0.73 (0.341.58) |
Psychiatric diagnosis | 55 (28.5%) | 38 (33.6%) | 17 (21.3%) | 0.53 (0.271.03) |
Diabetes mellitus | 63 (32.6%) | 37 (32.7%) | 26 (32.5%) | 0.99 (0.541.82) |
Renal failure | 33 (17.1%) | 20 (17.7%) | 13 (16.3%) | 0.90 (0.421.94) |
GI bleed | 23 (11.9%) | 18 (15.9%) | 5 (6.3%) | 0.35 (0.120.99) |
Admission MELD | 17.3/7.3 | 17.5/7.3 | 17.0/7.3 | 0.99 (0.951.03) |
Creatinine, median/IQR | 0.9/0.7 | 0.9/0.7 | 0.9/0.8 | 1.02 (0.821.27) |
Gastroenterology consult | 97 (50.3%) | 46 (40.7%) | 51 (63.8%) | 2.56 (1.424.63) |
Infection, UTI, pneumonia, other | 60 (31.1%) | 38 (33.6%) | 22 (27.5%) | 0.75 (0.401.40) |
Temperature 100.4F | 49 (26.8%) | 34 (32.4%) | 15 (19.2%) | 0.50 (0.251.00) |
WBC >11 k/mm3 | 50 (27.3%) | 28 (26.7%) | 22 (28.2%) | 1.08 (0.562.08) |
WBC <4 k/mm3 | 43 (23.5%) | 23 (21.9%) | 20 (25.6%) | 1.23 (0.622.44) |
INR >1.1 | 149 (92.0%) | 83 (93.3%) | 66 (90.4%) | 0.68 (0.222.13) |
Highest temperature, F | 98.9/1.1 | 99.1/1.3 | 98.8/0.8 | 0.82 (0.621.09) |
Highest HR | 98.2/20.4 | 97.4/22.4 | 99.2/17.4 | 1.00 (0.991.02) |
Highest RR | 24.5/13.7 | 25.2/16.8 | 23.5/7.8 | 0.99 (0.961.02) |
Lowest SBP | 101.0/20.0 | 99.4/20.3 | 102.2/19.7 | 0.99 (0.981.01) |
Lowest MAP | 73.0/12.2 | 73.2/13.3 | 72.7/10.6 | 1.00 (0.971.02) |
Lowest O2Sat | 92.6/13.6 | 91.0/17.7 | 94.9/2.8 | 1.04 (0.991.10) |
Highest PT | 15.8/3.8 | 15.9/3.7 | 15.7/3.9 | 0.98 (0.901.08) |
Platelets 50 k/mm3 | 30 (15.9%) | 21 (19.3%) | 9 (11.3%) | 0.53 (0.231.23) |
Of the patients who received paracentesis (n=80), 14% were diagnosed with SBP. Of these, 55% received prophylaxis on discharge. Among the patients who did not receive paracentesis (n=113), 38 (34%) received antibiotics for another documented infection (eg, pneumonia), and 25 patients (22%) received antibiotics with no other documented infection or evidence of variceal bleeding. Of these 25 patients who were presumed to be empirically treated for SBP (Figure 1), only 20% were prescribed prophylactic antibiotics on discharge.

CONCLUSION
We found that many patients with decompensated cirrhosis and ascites did not receive paracentesis when hospitalized, which is similar to previously published data.[4, 6, 7] Clinical evidence of infection, such as fever or elevated WBC count, did not increase the odds of receiving paracentesis. Many patients treated for SBP were not discharged on prophylaxis.
This study is limited by its small single‐center design. We could only use data from 1 year (2009), because study data collection was part of a quality‐improvement project that took place for that year only. We did not adjust for the number of red blood cells in the ascitic fluid samples. We were also unable to determine the timing of gastroenterology consultation (whether it was done prior to paracentesis), admission venue (floor vs intensive care), or patient history of SBP.
Despite these limitations, there are important implications. First, the decision to perform paracentesis was not associated with symptoms of infection, although some clinical factors (eg, low platelets or GI bleeding) were associated with reduced odds of receiving paracentesis. Second, a majority of patients treated for SBP did not receive prophylactic antibiotics at discharge. These findings suggest a clear opportunity to increase awareness and acceptance of AASLD guidelines among hospital medicine practitioners. Quality‐improvement efforts should focus on the education of providers, and future research should identify barriers to paracentesis at both the practitioner and system levels (eg, availability of interventional radiology). Checklists or decision support within electronic order entry systems may also help reduce the low rates of paracentesis seen in our and prior studies.[4, 6, 7]
Disclosures: Dr. Lagu is supported by the National Heart, Lung, and Blood Institute of the National Institutes of Health under award number K01HL114745. Drs. Lagu, Ghaoui, and Brooling had full access to all of the data in the study. They take responsibility for the integrity of the data and the accuracy of the data analysis. Drs. Lagu, Ghaoui, and Brooling conceived of the study. Dr. Ghaoui acquired the data. Ms. Friderici carried out the statistical analyses. Drs. Lagu, Ghaoui, Brooling, Lindenauer, and Ms. Friderici analyzed and interpreted the data, drafted the manuscript, and critically reviewed the manuscript for important intellectual content. The authors report no conflicts of interest.
Ascites is the most common complication of cirrhosis leading to hospital admission.[1] Approximately 12% of hospitalized patients who present with decompensated cirrhosis and ascites have spontaneous bacterial peritonitis (SBP); half of these patients do not present with abdominal pain, fever, nausea, or vomiting.[2] Guidelines published by the American Association for the Study of Liver Diseases (AASLD) recommend paracentesis for all hospitalized patients with cirrhosis and ascites and also recommend long‐term antibiotic prophylaxis for survivors of an SBP episode.[3] Despite evidence that in‐hospital mortality is reduced in those patients who receive paracentesis in a timely manner,[4, 5] only 40% to 60% of eligible patients receive paracentesis.[4, 6, 7] We aimed to describe clinical predictors of paracentesis and use of antibiotics following an episode of SBP in patients with decompensated cirrhosis and ascites.
METHODS
We conducted a retrospective cohort study of adults admitted to a single tertiary care center between January 1, 2009 and December 31, 2009.7 We included patients with an International Classification of Diseases, Ninth Revision discharge code consistent with decompensated cirrhosis who met clinical criteria for decompensated cirrhosis (see
RESULTS
We identified 193 admissions for 103 patients with decompensated cirrhosis and ascites (Table 1). Of these, 41% (80/193) received diagnostic paracentesis. Mean/standard deviation for age was 53.6/12.4 years; 71% of patients were male and 63% were English speaking. Common comorbidities included diabetes mellitus (33%), psychiatric diagnosis (29%), substance abuse (18%), and renal failure (17%). Excluding SBP, 31% of patients had another documented infection. Gastroenterology was consulted in 50% of the admissions. Fever was present in 27% of patients, elevated white blood cell (WBC) count (ie, WBC >11 k/mm3) was present in 27% of patients, International Normalized Ratio (INR) was elevated (>1.1) in 92% of patients, and 16% of patients had a platelet count of <50,000/mm3. Patients who received paracentesis were less likely to have a fever on presentation (19% vs 32%, P=0.06), low (ie, <50,000/mm3) platelet count (11% vs 19%, P=0.14), or concurrent gastrointestinal (GI) bleed (6% vs 16%, P=0.05). In a multiple logistic regression model including characteristics associated at P0.2 with paracentesis, fever, low platelet count, and concurrent GI bleeding were associated with decreased odds of receiving paracentesis (Appendix 1).
Overall, N=193, Mean/SD or N (%)* | Paracentesis (), n=113, Mean/SD or N (%) | Paracentesis (+), n=80, Mean/SD or N (%) | Odds Ratio (95% CI) | |
---|---|---|---|---|
| ||||
Age, y | 53.6/12.4 | 54.1/13.4 | 53.2/11.7 | 1.00 (0.981.03) |
Sex (male) | 137 (71.0%) | 78 (69.0%) | 59 (73.8%) | 1.26 (0.672.39) |
English speaking | 122 (63.2%) | 69 (61.1%) | 53 (66.3%) | 1.25 (0.692.28) |
Etiology | ||||
Alcohol | 120 (62.2%) | 74 (65.5%) | 46 (57.5%) | 0.71 (0.401.29) |
Hepatitis C | 94 (48.7%) | 57 (50.4%) | 37 (46.3%) | 0.85 (0.481.50) |
Hepatitis B | 16 (8.3%) | 7 (6.2%) | 9 (11.3%) | 1.92 (0.685.39) |
NASH | 8 (4.2%) | 4 (3.5%) | 4 (5.0%) | 1.43 (0.355.91) |
Cryptogenic | 11 (5.7%) | 6 (5.3%) | 5 (6.3%) | 1.19 (0.354.04) |
Comorbidities | ||||
Substance abuse | 34 (17.6%) | 22 (19.5%) | 12 (15.0%) | 0.73 (0.341.58) |
Psychiatric diagnosis | 55 (28.5%) | 38 (33.6%) | 17 (21.3%) | 0.53 (0.271.03) |
Diabetes mellitus | 63 (32.6%) | 37 (32.7%) | 26 (32.5%) | 0.99 (0.541.82) |
Renal failure | 33 (17.1%) | 20 (17.7%) | 13 (16.3%) | 0.90 (0.421.94) |
GI bleed | 23 (11.9%) | 18 (15.9%) | 5 (6.3%) | 0.35 (0.120.99) |
Admission MELD | 17.3/7.3 | 17.5/7.3 | 17.0/7.3 | 0.99 (0.951.03) |
Creatinine, median/IQR | 0.9/0.7 | 0.9/0.7 | 0.9/0.8 | 1.02 (0.821.27) |
Gastroenterology consult | 97 (50.3%) | 46 (40.7%) | 51 (63.8%) | 2.56 (1.424.63) |
Infection, UTI, pneumonia, other | 60 (31.1%) | 38 (33.6%) | 22 (27.5%) | 0.75 (0.401.40) |
Temperature 100.4F | 49 (26.8%) | 34 (32.4%) | 15 (19.2%) | 0.50 (0.251.00) |
WBC >11 k/mm3 | 50 (27.3%) | 28 (26.7%) | 22 (28.2%) | 1.08 (0.562.08) |
WBC <4 k/mm3 | 43 (23.5%) | 23 (21.9%) | 20 (25.6%) | 1.23 (0.622.44) |
INR >1.1 | 149 (92.0%) | 83 (93.3%) | 66 (90.4%) | 0.68 (0.222.13) |
Highest temperature, F | 98.9/1.1 | 99.1/1.3 | 98.8/0.8 | 0.82 (0.621.09) |
Highest HR | 98.2/20.4 | 97.4/22.4 | 99.2/17.4 | 1.00 (0.991.02) |
Highest RR | 24.5/13.7 | 25.2/16.8 | 23.5/7.8 | 0.99 (0.961.02) |
Lowest SBP | 101.0/20.0 | 99.4/20.3 | 102.2/19.7 | 0.99 (0.981.01) |
Lowest MAP | 73.0/12.2 | 73.2/13.3 | 72.7/10.6 | 1.00 (0.971.02) |
Lowest O2Sat | 92.6/13.6 | 91.0/17.7 | 94.9/2.8 | 1.04 (0.991.10) |
Highest PT | 15.8/3.8 | 15.9/3.7 | 15.7/3.9 | 0.98 (0.901.08) |
Platelets 50 k/mm3 | 30 (15.9%) | 21 (19.3%) | 9 (11.3%) | 0.53 (0.231.23) |
Of the patients who received paracentesis (n=80), 14% were diagnosed with SBP. Of these, 55% received prophylaxis on discharge. Among the patients who did not receive paracentesis (n=113), 38 (34%) received antibiotics for another documented infection (eg, pneumonia), and 25 patients (22%) received antibiotics with no other documented infection or evidence of variceal bleeding. Of these 25 patients who were presumed to be empirically treated for SBP (Figure 1), only 20% were prescribed prophylactic antibiotics on discharge.

CONCLUSION
We found that many patients with decompensated cirrhosis and ascites did not receive paracentesis when hospitalized, which is similar to previously published data.[4, 6, 7] Clinical evidence of infection, such as fever or elevated WBC count, did not increase the odds of receiving paracentesis. Many patients treated for SBP were not discharged on prophylaxis.
This study is limited by its small single‐center design. We could only use data from 1 year (2009), because study data collection was part of a quality‐improvement project that took place for that year only. We did not adjust for the number of red blood cells in the ascitic fluid samples. We were also unable to determine the timing of gastroenterology consultation (whether it was done prior to paracentesis), admission venue (floor vs intensive care), or patient history of SBP.
Despite these limitations, there are important implications. First, the decision to perform paracentesis was not associated with symptoms of infection, although some clinical factors (eg, low platelets or GI bleeding) were associated with reduced odds of receiving paracentesis. Second, a majority of patients treated for SBP did not receive prophylactic antibiotics at discharge. These findings suggest a clear opportunity to increase awareness and acceptance of AASLD guidelines among hospital medicine practitioners. Quality‐improvement efforts should focus on the education of providers, and future research should identify barriers to paracentesis at both the practitioner and system levels (eg, availability of interventional radiology). Checklists or decision support within electronic order entry systems may also help reduce the low rates of paracentesis seen in our and prior studies.[4, 6, 7]
Disclosures: Dr. Lagu is supported by the National Heart, Lung, and Blood Institute of the National Institutes of Health under award number K01HL114745. Drs. Lagu, Ghaoui, and Brooling had full access to all of the data in the study. They take responsibility for the integrity of the data and the accuracy of the data analysis. Drs. Lagu, Ghaoui, and Brooling conceived of the study. Dr. Ghaoui acquired the data. Ms. Friderici carried out the statistical analyses. Drs. Lagu, Ghaoui, Brooling, Lindenauer, and Ms. Friderici analyzed and interpreted the data, drafted the manuscript, and critically reviewed the manuscript for important intellectual content. The authors report no conflicts of interest.
REFERENCES
1. Spanish Collaborative Study Group On Therapeutic Management In Liver Disease. Multicenter hospital study on prescribing patterns for prophylaxis and treatment of complications of cirrhosis. Eur J Clin Pharmacol. 2002;58(6):435–440.
2. Bacterial infection in patients with advanced cirrhosis: a multicentre prospective study. Dig Liver Dis. 2001;33(1):41–48.
3. AASLD. Introduction to the revised American Association for the Study of Liver Diseases Practice Guideline management of adult patients with ascites due to cirrhosis 2012. Hepatology. 2013;57(4):1651–1653.
4. Paracentesis is associated with reduced mortality in patients hospitalized with cirrhosis and ascites. Clin Gastroenterol Hepatol. 2014;12(3):496–503.e1.
5. Delayed paracentesis is associated with increased in-hospital mortality in patients with spontaneous bacterial peritonitis. Am J Gastroenterol. 2014;109(9):1436–1442.
6. The quality of care provided to patients with cirrhosis and ascites in the Department of Veterans Affairs. Gastroenterology. 2012;143(1):70–77.
7. Measurement of the quality of care of patients admitted with decompensated cirrhosis. Liver Int. 2014;34(2):204–210.
Lungs donated after cardiac arrest, brain death yield similar survival rates
AUSTIN, TEX. – The risk of death at 1 year after lung transplantation with organs donated either after cardiac arrest or after brain death was virtually the same, an analysis of the literature has shown.
“Donation after cardiac death appears to be a safe and effective method to expand the donor pool,” said Dr. Dustin Krutsinger of the University of Iowa, Iowa City, who presented the findings during the Hot Topics in Pulmonary Critical Care session at the annual meeting of the American College of Chest Physicians.
Over the years, the demand for donor lungs has steadily increased while the number of available organs has remained static. This is due, in part, to physicians' concerns about injury to the organs during the ischemic period, as well as the delay, which can be as long as an hour, between withdrawal of life support and organ procurement. However, Dr. Krutsinger said the similar outcomes in the two cohorts could reflect the fact that, before procurement, systemic circulation allows the lungs to oxygenate by perfusion, lessening the impact of the ischemic period.
“There is also a thought that the ischemic period might actually protect the lungs and the liver from reperfusion injury. And we’re avoiding brain death, which is not a completely benign state,” he told the audience.
In an extensive review of the literature on 1-year survival after lung transplantation, the investigators identified 519 unique citations; 58 were selected for full-text review, 10 observational cohort studies were included in the systematic review, and 5 studies were included in the meta-analysis.
Dr. Krutsinger and his colleagues found no significant difference in 1-year survival rates between the donation-after-cardiac-death and donation-after-brain-death cohorts (P = .658). In a pooled analysis of the five studies, there was no significant difference in the risk of death at 1 year with either procurement approach (relative risk, 0.66; 95% confidence interval, 0.38-1.15; P = .15). Although he acknowledged that the findings were limited by shortcomings in the data, including the retrospective analysis of unmatched cohorts and the short follow-up period, Dr. Krutsinger said in an interview that the data were compelling enough for institutions to begin rethinking organ procurement and transplantation protocols. In addition to his own study, he cited a 2013 study indicating that including lungs donated after cardiac arrest could expand the pool of available organs by as much as 50% (Ann. Am. Thorac. Soc. 2013;10:73-80).
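As a brief interpretive note (not part of the presentation itself), a pooled relative risk is conventionally read as nonsignificant when its 95% confidence interval includes 1.0, which is consistent with the reported P value:

$$\mathrm{RR} = 0.66,\qquad 95\%\ \mathrm{CI} = [0.38,\ 1.15] \ni 1.0 \;\Longrightarrow\; P > .05 \quad (\text{here } P = .15).$$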
But challenges remain.
“There are some things you can do to the potential donors that are questionable ethicswise, such as administering heparin premortem, which would be beneficial to the actual recipients. But, up until they are pronounced dead, they are still a patient. You don’t really have that complication with a donation after brain death, since once brain death is determined, the person is officially dead. Things you then do to them to benefit the eventual recipients aren’t being done to a ‘patient.’ ”
Still, Dr. Krutsinger said that if organs procured after cardiac arrest were to become more common than after brain death, he would be “disappointed” since the data showed “the outcomes are similar, not inferior.”
Dr. Krutsinger said he had no relevant disclosures.
On Twitter @whitneymcknight
AT CHEST 2014
Key clinical point: Expansion of organ donation programs to include organs donated after cardiac death could help meet a growing demand for donated lungs.
Major finding: No significant difference was seen in lung transplantation 1-year survival rates between donation after cardiac arrest and donation after brain death.
Data source: A systematic review of 10 observational cohort studies and a meta-analysis of 5 studies, chosen from more than 500 citations, that reported 1-year survival after lung transplantation with organs donated after either cardiac arrest or brain death.
Disclosures: Dr. Krutsinger said he had no relevant disclosures.
Hospitalists Less Likely Targets of Malpractice Claims Than Other Physicians
In the article "Liability Impact of the Hospitalist Model of Care," Adam Schaffer, MD, a hospitalist at Brigham and Women's Hospital in Boston, writes that hospitalists average 0.52 malpractice claims per 100 physician coverage years (PCYs), while non-hospitalist internal medicine physicians have a rate of 1.91 claims per 100 PCYs. By comparison, ED physicians average 3.5 claims per 100 PCYs, general surgeons average 4.7 claims, and OB/GYNs average 5.56 claims (P<0.001 for all comparisons).
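For context, a back-of-the-envelope comparison of the reported rates (not a figure given in the article) shows roughly how much lower the hospitalist claim rate is than that of non-hospitalist internists:

$$\frac{0.52\ \text{claims per 100 PCYs}}{1.91\ \text{claims per 100 PCYs}} \approx 0.27,$$

that is, a claim rate roughly 73% lower.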
"I was fairly surprised because the magnitude of the decreased risk…was fairly significant and statistically significant," Dr. Schaffer says. He notes that having relatively short interactions with patients and the difficulties of care transitions would appear to make it difficult for hospitalists to establish the type of close relationships with patients that can help prevent malpractice claims. However, hospitalists have overcome that hurdle.
An editorial that accompanies the JHM study contends that hospitalists develop and hone skills "which allow them to quickly establish rapport with patients and families." The editorial was penned by hospitalist Kevin O'Leary, MD, MS, SFHM, of Northwestern University Feinberg School of Medicine in Chicago, and JHM Editor-in-Chief Andrew Auerbach, MD, MPH, SFHM, of the University of California, San Francisco.
"Even though you may have a relatively brief relationship with the patient," Dr. Schaffer adds, "the fact that you're in the hospital, able to see them, meet with them, answer their questions multiple times a day if need be, that may actually help establish a strong and robust physician-patient relationship."
Visit SHM's blog, "The Hospital Leader," for an exploration of malpractice suits and a Q&A with study author Adam Schaffer.
Once-Weekly Antibiotic Might Be Effective for Treatment of Acute Bacterial Skin Infections
Background: Acute bacterial skin infections are common and often require hospitalization for intravenous antibiotic administration. Treatment covering gram-positive bacteria usually is indicated. Dalbavancin is effective against gram-positives, including MRSA. Its long half-life makes it an attractive alternative to other commonly used antibiotics, which require more frequent dosing.
Study design: Phase 3, double-blinded RCT.
Setting: Multiple international centers.
Synopsis: Researchers randomized 1,312 patients with acute bacterial skin and skin-structure infections and signs of systemic infection requiring intravenous antibiotics to receive either dalbavancin on days one and eight (with placebo on other days) or several doses of vancomycin with an option to switch to oral linezolid. The primary endpoint was cessation of spread of erythema and a temperature of ≤37.6°C at 48–72 hours. Secondary endpoints included a decrease in lesion area of ≥20% at 48–72 hours and clinical success at the end of therapy (determined by clinical and historical features). Primary endpoint results were similar in the dalbavancin and vancomycin-linezolid groups (79.7% and 79.8%, respectively), meeting the prespecified 10-percentage-point noninferiority margin (a brief sketch of this comparison follows the citation below). The secondary endpoints were similar between the groups. Limitations of the study included the early primary endpoint and the lack of a noninferiority analysis of the secondary endpoints and of a cost-effectiveness analysis.
Bottom line: Once-weekly dalbavancin appears to be similarly efficacious to intravenous vancomycin in the treatment of acute bacterial skin infections in terms of outcomes within 48–72 hours of therapy and might provide an alternative to continued inpatient hospitalization for intravenous antibiotics in stable patients.
Citation: Boucher HW, Wilcox M, Talbot GH, Puttagunta S, Das AF, Dunne MW. Once-weekly dalbavancin versus daily conventional therapy for skin infection. N Engl J Med. 2014;370(23):2169-2179.
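As a minimal sketch of the noninferiority comparison described in the synopsis above, using only the success rates and the 10-percentage-point margin reported there:

$$\Delta = p_{\text{dalbavancin}} - p_{\text{vancomycin-linezolid}} = 79.7\% - 79.8\% = -0.1\ \text{percentage points},$$

with noninferiority declared when the lower bound of the confidence interval for $\Delta$ lies above the $-10$ percentage-point margin. Because the observed difference sits far from that margin, the reported result is consistent with the trial's noninferiority conclusion.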