Applying a Quality Improvement Framework to Operating Room Efficiency in an Academic-Practice Partnership
From the Case Western Reserve University School of Medicine, Cleveland, OH.
Abstract
- Objective: To improve operating room (OR) scheduling efficiency at a large academic institution through the use of an academic-practice partnership and quality improvement (QI) methods.
- Methods: The OR administrative team at a large academic hospital partnered with students in a graduate level QI course to apply QI tools to the problem of OR efficiency.
- Results: The team found wide variation in the way that surgeries were scheduled, along with other factors that contributed to inefficient OR utilization. A plan-do-study-act (PDSA) cycle was applied to the problem of discrepant interpretations of case length among surgeons, which resulted in poor case length accuracy. Our intervention, adding time on the schedule for cases, did not show consistent improvement in case length accuracy.
- Conclusion: Although our intervention did not lead to sustained improvements in OR scheduling efficiency, our project demonstrates how QI tools can be taught and applied in an academic course to address a management problem. Further research is needed to study the impact of student teams on health care improvement.
The operating room is one of the most costly departments of a hospital. At University Hospitals Case Medical Center (UHCMC), as at many hospitals, operating room utilization is a key area of focus for both operating room (OR) and hospital administrators. Efficient use of the OR is an important aspect of a hospital’s finances and patient-centeredness.
UHCMC uses block scheduling, a common OR scheduling design. Each surgical department is allotted a certain number of blocks (hours of reserved OR time) that it is responsible for filling with surgical cases and that the hospital is responsible for staffing. Block utilization rate is a metric commonly used to measure OR efficiency: it divides the time that the OR is in use by the total block time allocated to the department, while accounting for room turnaround time. An industry benchmark is 75% block utilization [1], which was adopted as an internal target at UHCMC. Meeting this target is necessary because the hospital (rather than each individual surgical department) is responsible for ensuring that the appropriate number of non-surgeon staff (eg, anesthesiologists, nurses, scrub techs, and facilities staff) is available. Poor utilization rates indicate that staff and equipment are used inefficiently, which can affect the hospital’s financial well-being [2]. Block utilization is the result of a complex system, making it challenging to improve: many people are involved in scheduling, and a large degree of inherent uncertainty exists in the system.
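For readers who want to see the metric concretely, the sketch below is a minimal illustration of a block utilization calculation. It is not the institution's actual formula: the per-case turnaround credit, its default of 30 minutes, and the field names are all assumptions introduced for illustration, since institutions define the turnaround adjustment differently.

```python
from dataclasses import dataclass

@dataclass
class Case:
    """One surgical case, with wheels-in/wheels-out times in minutes from block start."""
    wheels_in: float
    wheels_out: float

def block_utilization(cases: list[Case], allocated_minutes: float,
                      turnaround_credit: float = 30.0) -> float:
    """Utilized block time divided by allocated block time.

    Each case contributes its in-room (WIWO) duration; a fixed turnaround
    credit between consecutive cases approximates room cleanup and setup.
    Both the credit and its per-gap application are assumptions for this
    sketch -- real institutions define the adjustment differently.
    """
    used = sum(c.wheels_out - c.wheels_in for c in cases)
    used += turnaround_credit * max(len(cases) - 1, 0)  # no credit after the last case
    return used / allocated_minutes

# Example: three cases in an 8-hour (480-minute) block
cases = [Case(0, 120), Case(150, 260), Case(290, 400)]
rate = block_utilization(cases, allocated_minutes=480)
print(f"Block utilization: {rate:.0%}")  # compare against the 75% benchmark [1]
```

In this invented example the block is 83% utilized, above the 75% benchmark; unfilled gaps beyond the turnaround credit are what drag the rate down.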
At UHCMC, block utilization rates by department ranged from 52% to 80%, with an overall utilization of 64% from February to July 2014. Given this wide variation, higher-level OR management staff initiated a project in which OR administrators partnered with students in a graduate-level QI course in an effort to improve overall block utilization. They believed that improving the block utilization rate would improve the effectiveness, patient-centeredness, and efficiency of care, health care delivery goals described by the Institute of Medicine [3].
Methods
Setting
The OR at UHCMC contains 4 operating suites that serve over 25,000 patients and train over 900 residents each year. Nearly 250 surgeons in 23 departments use the OR. The OR schedule at our institution is coordinated by block scheduling, as described above. If a surgical department cannot fill its block, it must release the time to central scheduling for re-allocation to another department.
Application of QI Process
This QI project was an academic-practice collaboration between UHCMC and a graduate-level course at Case Western Reserve University called The Continual Improvement of Healthcare: an Interdisciplinary Course [4]. Faculty course instructors solicit applications of QI projects from departments at UHCMC. The project team consisted of 4 students (from medicine, social work, public health, and bioethics), 2 administrative staff from UHCMC, and a QI coach who is on the faculty at Case Western. Guidance was provided by 2 faculty facilitators. The students attended 15 weekly class sessions, 4 meetings with the project team, and numerous data-gathering sessions with other hospital staff, and held several outside-of-class student team meetings. An early class session was devoted to team skills and the Seven-Step meeting process [5]. Each classroom session consisted of structured group activities to practice the tools of the QI process.
Tool 1: Global Aim
The team first established a global aim: to improve the OR block utilization rate at UHCMC. This aim was based on the initial project proposal from UHCMC. The global aim explains the reason that the project team was established, and frames all future work [7].
Tool 2: Industry Assessment
Based on the global aim, the student team performed an industry assessment to understand the strategies other institutions use to improve block utilization rate. Peer-reviewed journal articles and case reports were reviewed, and the student team was able to contact a team at another institution working on similar issues.
Overall, 2 broad categories of interventions to improve block utilization were identified. Some institutions addressed the way OR time was scheduled, making improvements to block time allotment, case timing, and the handling of add-on cases [8]. Others focused on using OR time more efficiently by addressing room turnover and delays, including waiting for surgeons and waiting for hospital beds [9]. Because each hospital’s case mix is so distinct, hospitals that successfully made changes all used a variety of interventions [10–12]. After the industry assessment, the student team realized that there would be a large number of possible approaches to the problem of block utilization, and that a better understanding of the actual scheduling process at UHCMC was necessary to find an area of focus.
Tool 3: Process Map
As the project team began to address the global aim of improving OR block utilization at UHCMC, they needed a thorough understanding of how OR time was allotted and used. To do this, the student team created a process map by interviewing process stakeholders, including the OR managers and department schedulers in orthopedics, general surgery, and urology, as suggested by the OR managers. The perspectives of these staff were critical to understanding the process of operating room scheduling.
Through the creation of the process map, the project team found that there was wide variation in the process and structure for scheduling surgeries. Some departments used one central scheduler while others used individual secretaries for each surgeon. Some surgeons maintained control over changing their schedule, while others did not. Further, the project team learned that the metric of block utilization rate was of varying importance to people working on the ground.
Tool 4: Fishbone Diagram
After understanding the process, the project team considered all of the factors that contribute to low block utilization and organized them in a fishbone (cause-and-effect) diagram, providing a visual summary of how many small issues feed into the overall OR system.
Tool 5: Specific Aim
Though the global aim was to improve block utilization, the project team needed to choose a specific aim that met SMART criteria: Specific, Measurable, Achievable, Results-focused, and Time-bound [7]. After considering multiple potential areas of initial focus, the OR staff suggested focusing on the issue of case length accuracy. In qualitative interviews, the student team had found that the surgery request forms ask for “case length,” and the schedulers were not sure how the surgeons defined it. When the OR is booked for an operation, the amount of time blocked out is the time from when the patient is brought into the operating room to the time that the patient leaves the room, or WIWO (Wheels In, Wheels Out). This WIWO time includes anesthesia induction and preparations for surgery such as positioning. Some surgeons, however, think of case length as only the time that the patient is operated on, or CTC (Cut to Close). Thus, a surgeon thinking only of CTC time may request less time than the case really requires. The student team surveyed the urology surgeons and found that 2 considered case length to mean WIWO, while 4 considered it to mean CTC.
Tools 6 and 7: PDSA Cycle and Control Charts
The Plan-Do-Study-Act (PDSA) cycle is an iterative plan of action for designing and testing a specific change [7]. This part of the QI process involved implementing and testing a change to address our specific aim. As the first cycle of change, the team asked the scheduler to add 15 minutes to the surgeons’ requested case times over 1 week. Of the urologists scheduled that week, one had reported using CTC and the other had not completed the student team’s survey. To study the change, the project team used control charts for the 2 surgeons whose case times were adapted. Prior to the intervention, the surgeons averaged at least 20 minutes over their scheduled time, with wide variation; they infrequently completed cases at or below their requested case time, and most of the inaccuracy came from running long. The control charts showed that after the change in scheduling time, the 2 surgeons still went over their allotted case time, but to a lesser degree.
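As an illustration of the control chart analysis, the sketch below computes the center line and limits of an individuals (XmR) chart from per-case overruns (actual minus scheduled minutes). The choice of an XmR chart is an assumption, since the article does not specify the chart type, and the data values are invented for demonstration.

```python
import statistics

def xmr_limits(values: list[float]) -> tuple[float, float, float]:
    """Center line and control limits for an individuals (XmR) chart.

    Limits are mean +/- 2.66 * average moving range (the standard XmR
    constant). Points outside the limits signal special-cause variation;
    a drop in the center line after a change suggests the change helped.
    """
    center = statistics.mean(values)
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    mr_bar = statistics.mean(moving_ranges)
    return center, center - 2.66 * mr_bar, center + 2.66 * mr_bar

# Hypothetical per-case overruns (minutes over scheduled time) for one surgeon
baseline = [35.0, 10.0, 45.0, 20.0, 30.0, 15.0, 40.0]  # before adding 15 min
post = [20.0, 5.0, 25.0, 10.0, 15.0]                    # after adding 15 min

for label, data in [("baseline", baseline), ("post-change", post)]:
    center, lcl, ucl = xmr_limits(data)
    print(f"{label}: center={center:.1f} min, limits=({lcl:.1f}, {ucl:.1f})")
```

Plotted over time with these limits, each surgeon's overruns would show the pattern the team observed: a center line above zero before the change, shifting closer to zero (but not reaching it) afterward.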
After gaining new information, the next step in the PDSA cycle is to determine the next test of change. The student team recommended sharing these data with the surgeons to consider next steps in improving block utilization, though time constraints of the semester limited continued involvement of the student team in the next PDSA cycle.
Discussion
Through the application of QI tools, new insight was gained about OR efficiency and potential improvements. The student team talked to numerous staff involved in scheduling and each discussion increased understanding of the issues that lead to OR inefficiency. The process map and fishbone diagram provided a visual expression of how small issues could impact the overall OR system. Application of QI tools also led the team to the discovery that surgeons may be interpreting case length in disparate ways, contributing to problems with scheduling.
Though the intervention did not have a significant impact over 1 week, more time for subsequent PDSA cycles may have resulted in clinical improvements. Despite the limitations, the student team uncovered an important aspect of the block scheduling process, providing valuable information and insight for the department around this scheduling issue. The student team’s work was shared among multiple surgical departments, and QI work in the department is ongoing.
Implications for Health Care Institutions
Nontraditional Projects Can Work
The issue of OR utilization is perhaps not a “traditional” QI project, given the macro nature of the problem. Once it was broken down into discrete processes, however, problems such as OR turnover and scheduling redundancies looked much more like traditional QI projects. It may be beneficial for institutions to broaden the scope of QI to problems that may, at first glance, seem beyond the realm of process mapping, fishbone diagramming, and SMART aims. QI tools can turn management problems into projects that can be tackled by small teams, creating a culture of change in an organization [13].
Benefits of Student Teams
There are clear benefits to the institution of working with students. Our hospital-based team members found it beneficial to have independent observers review the process and recommend improvements. Students were able to challenge the status quo and point out inefficiencies that had persisted due to institutional complacency and lack of resources. The hospital employees were impressed and surprised that the students found the misunderstanding about case length, and noted that it suggests there may be other places where miscommunication occurs among the various people involved in OR scheduling. The students’ energy and time were supported by the QI expertise of the course instructors and the practical knowledge of the hospital-based team members. Similar benefits have been noted by others utilizing collaborative QI educational models [14,15].
Benefits for Students
For the students on the team, the opportunity to apply QI concepts in the real world was a unique learning experience. First, the project was truly interdisciplinary. The students came from varied fields, and they worked with schedulers, surgeons, and office managers, gaining insight into the meaning and perspectives of interprofessional collaboration. The students came to appreciate the complexity of the OR staff’s work and the tensions they faced in balancing the schedules of nurses, anesthesiologists, and other OR support staff. Additionally, interdisciplinary collaboration in health care is of increasing importance in everyday practice [16,17]. A strong understanding of collaboration across professions will be a cornerstone of the students’ credentials as they move into the workforce.
There is also value in adding real work experience to academics. The students were able to appreciate not only the concepts of QI but the actual challenges of implementing QI methodology in an institution where people had varying levels of buy-in. Quality improvement is about more than sitting at a whiteboard coming up with charts—it is about enacting actual change and understanding specific real-world situations. The hospital collaboration allowed the students to gain experience that is impossible to replicate in the classroom.
Limitations and Barriers
As noted in other academic-practice collaborations, the need to complete the project in one semester presents a barrier to collaboration; the working world does not operate on an academic timeline [14]. The students were limited to testing only one cycle of change, which was disappointing, as they would have liked to implement multiple PDSA cycles. The OR managers faced barriers as well: they invested time in educating students who would soon move on, and they will have to repeat the process with a new group of students. The department has continued this work, but losing the students whom they had oriented was not ideal.
The course instructors were flexible in allowing the project team to spend the majority of its time breaking down the problem of OR block utilization into testable changes, which was the bulk of our work. However, the skill to which the team was able to dedicate the least time, testing and implementing change, is both useful for the students to learn and beneficial for the organization. Moving forward, allowing teams to build on the previous semester’s work, perhaps with a formal student handoff, might be tried.
Future Directions
Although our intervention did not lead to sustained improvements in OR scheduling efficiency, our project demonstrates how QI tools can be taught and applied in an academic course to address a management problem. Research to specifically understand institutional benefits of academic-practice collaborations would be helpful in recruiting partners and furthering best practices for participants in these partnerships. Research is also needed to understand the impact of QI collaborative models such as the one described in this paper on improving interprofessional teamwork and communication skills, as called for by health care professional educators [16].
Corresponding author: Danielle O’Rourke-Suchoff, BA, Case Western Reserve University School of Medicine, Office of Student Affairs, 10900 Euclid Ave., Cleveland, OH 44106, [email protected].
Financial disclosures: none.
1. The right strategies can help increase OR utilization. OR Manager 2013;29:21–2.
2. Jackson RL. The business of surgery. Managing the OR as a profit center requires more than just IT. It requires a profit-making mindset, too. Health Manage Technol 2002;23:20–2.
3. Institute of Medicine. Crossing the quality chasm: A new health system for the 21st century. Washington (DC): National Academy Press; 2001.
4. Hand R, Dolansky MA, Hanahan E, Tinsley N. Quality comes alive: an interdisciplinary student team’s quality improvement experience in learning by doing—health care education case study. Qual Approaches Higher Educ 2014;5:26–32.
5. Scholtes PR, Joiner BL, Streibel BJ. The team handbook. Oriel; 2003.
6. Institute for Healthcare Improvement. Open School. 2015. Accessed 13 Apr 2015 at www.ihi.org/education/ihiopenschool/Pages/default.aspx.
7. Ogrinc GS, Headrick LA, Moore SM, et al. Fundamentals of health care improvement: A guide to improving your patients’ care. 2nd ed. Oakbrook Terrace, IL: Joint Commission Resources and the Institute for Healthcare Improvement; 2012.
8. Managing patient flow: Smoothing OR schedule can ease capacity crunches, researchers say. OR Manager 2003;19:1,9–10.
9. Harders M, Malangoni MA, Weight S, Sidhu T. Improving operating room efficiency through process redesign. Surgery 2006;140:509–16.
10. Paynter J, Horne W, Sizemore R. Realizing revenue opportunities in the operating room. 2015. Accessed 13 Apr 2015 at www.ihi.org/resources/Pages/ImprovementStories/RealizingRevenueOpportunitiesintheOperatingRoom.aspx.
11. Cima RR, Brown MJ, Hebl JR, et al. Use of Lean and Six Sigma methodology to improve operating room efficiency in a high-volume tertiary-care academic medical center. J Am Coll Surg 2011;213:83–92.
12. Day R, Garfinkel R, Thompson S. Integrated block sharing: a win–win strategy for hospitals and surgeons. Manufact Serv Op Manage 2012;14:567–83.
13. Pardini-Kiely K, Greenlee E, Hopkins J, et al. Improving and sustaining core measure performance through effective accountability of clinical microsystems in an academic medical center. Jt Comm J Qual Patient Saf 2010;36:387–98.
14. Hall LW, Headrick LA, Cox KR, et al. Linking health professional learners and health care workers on action-based improvement teams. Qual Manag Health Care 2009;18:194–201.
15. Ogrinc GS, Nierenberg DW, Batalden PB. Building experiential learning about quality improvement into a medical school curriculum: The Dartmouth Experience. Health Aff 2011;30:716–22.
16. Interprofessional Education Collaborative Expert Panel. Core competencies for interprofessional collaborative practice. Washington, DC: Interprofessional Education Collaborative; 2011.
17. World Health Organization. Framework for action on interprofessional education and collaborative practice. Geneva: World Health Organization; 2010.
Interdisciplinary Geriatric Difficult Case Conference: Innovative Education Across the Continuum
From Wheaton Franciscan Healthcare (Ms. Fedel), Aspirus (Ms. Hackbarth), and Aurora Health Care (Mr. Malsch and Ms. Pagel).
Abstract
- Background: There is a nationwide shortage of providers prepared in geriatric care. Caring for complex older adults is challenging.
- Objective: To develop an efficient and affordable way to educate members of the interdisciplinary team involved in the care of geriatric patients.
- Methods: A team from 3 area health systems developed a plan to present monthly case studies via teleconference. Cases are presented by a direct caregiver using the Wisconsin Star Method to facilitate analysis of the case. A geriatric expert and another member of the team present teaching points, and questions are elicited and discussed.
- Results: The team has completed 18 consecutive monthly teleconferences. Participant satisfaction has been favorable. Participation on the call has increased approximately 300% since the initiation of the program.
- Conclusion: The case teleconference provides an accessible and affordable educational forum that provides learners an opportunity to improve their knowledge in care of older adults.
The number of older adults in the United States will nearly double between 2005 and 2030 [1] as the baby boom generation begins turning 65 and as life expectancy for older Americans increases. The Institute of Medicine’s (IOM) landmark report Retooling for an Aging America: Building the Health Care Workforce states that “unless action is taken immediately, the health care workforce will lack the capacity (in both size and ability) to meet the needs of older patients in the future [1].” One of its recommendations is to explore ways to widen the duties and responsibilities of workers at various levels of training. More health care providers need to be trained in the basics of geriatric care and should be capable of caring for older patients.
Team-based care is becoming more prevalent. Care delivered by interdisciplinary teams has been shown to improve patient outcomes [2]. A team led by one of the authors (PF) developed an intervention to increase the geriatric and teamwork competencies of interdisciplinary teams that serve patients throughout Wisconsin. The Interdisciplinary Geriatric Difficult Case Conference Call (IGDCC) is sponsored monthly by 3 Wisconsin health systems. The purpose is to provide opportunities to discuss clinical cases, to learn from one another and from experts, and to elevate the level of geriatric care in Wisconsin, Michigan, and beyond. Each month a difficult case is presented by a clinician involved in that patient’s care. Time is allotted for participants to ask questions, and teaching points are shared by a clinical expert to highlight concepts and provide additional context. The IGDCC is meant to be a joint learning exercise to explore a specific difficult patient situation and to build the skills and knowledge needed to improve care and transitions for older adults. The conference call is not a critique of the care provided, but rather an opportunity to learn jointly from the challenging situations all clinicians experience.
Background
The IGDCC was created by 4 members of 3 health systems in Wisconsin: Wheaton Franciscan Healthcare, Aspirus, and Aurora Health Care. The health systems serve a broad, partially overlapping geographic and demographic area of Wisconsin. The 4 members have collaborated on numerous projects in the past, including implementation of Nurses Improving Care for Healthsystem Elders (NICHE) [3]. A common concern among the team is the management of clinically challenging geriatric patients and the need for a workforce prepared to meet those challenges.
Problem/Issue
As mentioned above, the older adult population is increasing, and these statistics are reflected in our service area [4]. Exacerbating these demographic changes are a shortage of health care workers in all disciplines, inadequate geriatric training, and the increased prevalence of multiple chronic conditions. Older adults also have higher rates of 30-day readmissions as well as higher rates of functional decline and medical errors during hospital stays [5,6]. Effective interprofessional teamwork is essential for the delivery of high-quality patient care in an increasingly complex health environment [7]. The IOM’s Future of Nursing report recommends that nurses, who represent the largest segment of the US health workforce, achieve higher levels of training and be full partners in redesigning health care [8]. Unfortunately, effective care is hampered by poor coordination, limited communication, boundary infringement, and lack of understanding of roles [9]. Meta-analyses have demonstrated a positive relationship between team training interventions and outcomes [10,11].
Objectives
The objective of the IGDCC is to elevate the level of geriatric care in the region by providing an accessible and affordable forum for the education of health care workers involved in the care of our most vulnerable population. To meet this challenge, the 4 founding members of the IGDCC used the Aurora Health Care Geriatric Fellow’s Most Difficult Case (GFMCC) conference format as a model [12,13]. All disciplines are encouraged to participate, with announcements sent out via the leadership at the participating hospital systems. Participants may join the teleconference individually from their own telephone and computer; in addition, each participating hospital system frequently hosts an open teleconference room where participants may join as a group.
Conference Components
The team uses the Wisconsin Star Method framework for presentation and discussion of the case. The Star Method, developed by Timothy Howell, enables clinical data about a person to be mapped out onto a single field with 5 domains: medications, medical, behavioral, personal, and social [14], creating a visual representation of the complicated and interacting physical, emotional, and social issues of older adults (Figure). By becoming comfortable using this method, the learner can use a similar approach in their clinical practice to address the needs of the patient in a holistic manner.
The case call concludes with expert teaching points from both a geriatric expert and a member of the interdisciplinary team. The interdisciplinary team member is chosen based on the key issues raised by the case. For example, a case made complex by polypharmacy and adverse drug reactions might have a pharmacist presenting pertinent take-home messages for the learner. In addition, geriatric teaching experts (ie, a geriatrician or advanced practice geriatric nursing specialist) provide the learner with insights that they can apply to their future practice. Often, the teaching points consist of an analysis of the various geriatric syndromes present and how they can be managed in the complex older adult.
Implementation
Implementation of the IGDCC is coordinated by an oversight team with representation from each of the 3 sponsoring health systems. The oversight team currently includes 4 members: 3 geriatric clinical nurse specialists and a geriatric service line administrator. The team is responsible for:
- Planning the conference call schedule
- Making arrangements for case presenters and experts to contribute teaching points
- Registering participants and sharing written materials with participants
- Publicizing and encouraging attendance
- Soliciting feedback for continual improvement
- Exploring and implementing new ways to maximize learning.
Team members share duties and rotate case presentations. The Aurora and Wheaton Franciscan systems provide the geriatric specialists who deliver the expert teaching points. The Aspirus system provides the conference line and webinar application and supports publicity and evaluations. All 3 systems are supported by a geriatric clinical nurse specialist who identifies and helps prepare presenters, case presentations, and call participants. Over time, the conference call has evolved into a webinar format, allowing participants either to phone into the call for audio only or to participate with both audio and video. The video allows participants to watch on their computer screens as the case is presented using the Star Method. During the call, a member of the oversight team adds clinical details by typing into a Word template of a blank star, filling in each of the 5 domains in real time as the case is discussed. Another member of the team facilitates the call, introducing presenters and experts, describing the Star Method, and offering “housekeeping” announcements. The facilitator also watches the timing to make certain the agenda is followed and the call begins and ends on time. During the call, another member of the team updates the attendance spreadsheet and records the session.
Some participating facilities reserve a meeting room and project the webinar onto a screen for shared viewing. One of the participating sites has done this quite successfully, with a growing group of participants coming together to watch the case during their lunch hour. This allows an opportunity for group discussion while the site’s line is on “mute,” so as not to disrupt learners at other locations.
Measurement/Analysis
Attendance has steadily increased. From January to September 2015, the mean attendance per month was 29.1 (mode, 17). The monthly maximum was 62 (September 2015). The program enjoyed a boost in attendance beginning in July 2015, when Nurses Improving Care for Healthsystem Elders (NICHE) [3] began promoting the call-in opportunity to its NICHE coordinators at member health systems. In June 2015, the technology was improved to allow sessions to be recorded, and the recordings have grown in popularity, from 2 listeners per month in July 2015 to 23 listeners per month in September 2015.
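For programs tracking their own attendance, the summary statistics above are simple to reproduce. The sketch below is illustrative only: the monthly counts are hypothetical values chosen to be consistent with the reported mean, mode, and maximum, since the program’s raw attendance data are not published.

```python
from statistics import mean, mode

# Hypothetical monthly attendance counts, Jan-Sep 2015 (illustrative only;
# chosen so the summaries match the reported values: mean ~29.1, mode 17, max 62).
attendance = [17, 17, 20, 24, 25, 28, 33, 36, 62]

print(f"Mean:    {mean(attendance):.1f}")  # 29.1
print(f"Mode:    {mode(attendance)}")      # 17
print(f"Maximum: {max(attendance)}")       # 62
```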
Lessons Learned
In comparing the IGDCC with similar conference call educational offerings, the team found that the program was unique in 2 areas. First, in addition to hosting a rich discussion about the care of frail older adults with experts in the field, the team also sought to help staff learn how to present a difficult case to their peers. Three of the 4 committee members are geriatric clinical nurse specialists (a fourth clinical nurse specialist, from Aspirus, assists periodically) who have been able to mentor, guide, and encourage interdisciplinary team members to present a challenging case. Many presenters had never presented a difficult case in this format. Presenters found the process fun and rewarding and have offered to present cases again in the future.
A second unique feature was the use of the Wisconsin Star Method rather than a typical medical model framework for discussing a challenging case. The Star Method allows participants to increase their proficiency in providing comprehensive care while becoming more confident and mindful in addressing the complicated, interacting physical, emotional, and social issues of older adults [13].
A monthly post-call debriefing among committee members to review the strengths and weaknesses of the call was key to growing the program. The committee was able to critically review the process of the call, review participant surveys, and discuss next steps. Adding a webinar format, automatic email notification of calls, an electronic participant survey, call recordings, and contact hours were among the action items that resulted from these monthly debriefing calls.
The team also found the 3-system collaboration to be beneficial. Aspirus serves a large rural population, while Wheaton and Aurora serve diverse populations, and each adds to the participants’ experience. Responsibility for each IGDCC rotated among the systems, so the burden did not fall on any one health system. An annual call assignment list was maintained to note which system was responsible for the case each month and whether the geriatric expert was assigned and confirmed. Identifying the committee’s individual and collective expertise was helpful in overall project planning. The committee also developed a standard presenter guide and template and an expert teaching guide so that the monthly IGDCCs were consistent.
Challenges
The committee did not have a budget; participation on the committee was funded in-kind by each system. Aspirus used the electronic system it had in place at the time to support the project. An interactive conference call education platform can be challenging with multiple participants on an open line who may not mute their phones. Often, when a group of participants calls in from one phone line, it is difficult to know how many people are attending the IGDCC. It can also be challenging to facilitate the discussion component, as participants occasionally talk over one another.
Current Status/Future Directions
The team has completed 18 consecutive monthly IGDCCs. Our participation rate has tripled. Participant satisfaction remains favorable. The team is now offering 1 contact hour to participants, and our invitations to participate have been extended to national health care groups. Challenging cases will be presented from community sources outside the hospital. Focusing attention on elevating the level of geriatric care in our region using a community educational approach will give us new opportunities for collaborating on best practice in multiple settings across the care continuum.
Acknowledgment: The planning team acknowledges Evalyn Michira, MSN, RN, PHN, AGCNS-BC, for her assistance in call presentations.
Corresponding author: Margie Hackbarth, MBA, [email protected].
Financial disclosures: none.
1. Institute of Medicine. Retooling for an aging America: Building the health care workforce. Washington, DC: National Academies Press; 2008.
2. Mitchell P, Wynia M, Golden R, et al. Core principles and values of effective team-based health care. Discussion paper. Washington, DC: Institute of Medicine; 2012.
3. Nurses Improving Care for Healthsystem Elders. Accessed 1 Dec 2015 at www.nicheprogram.org/.
4. Wisconsin Department of Health Services. Southeastern region population report: 1 Jul 2013. Accessed 16 Feb 2015 at www.dhs.wisconsin.gov/sites/default/files/legacy/population/13data/southeastern.pdf.
5. From the Centers for Disease Control and Prevention. Public health and aging: trends in aging--United States and worldwide. JAMA 2003;289:1371–3.
6. Hall MJ, DeFrances CJ, Williams SN, et al. National Hospital Discharge Survey: 2007 summary. Natl Health Stat Report 2010;(29):1–20, 24.
7. Nembhard IM, Edmondson AC. Making it safe: The effects of leader inclusiveness and professional status on psychological safety and improvement efforts in health care teams. J Organiz Behav 2006; 27:941–66.
8. Institute of Medicine. The future of nursing: leading change, advancing health. Washington, DC: National Academies Press; 2011.
9. Reeves S, Zwarenstein M, Goldman J, et al. Interprofessional education: effects on professional practice and health care outcomes. Cochrane Database Syst Rev 2013;3:CD002213.
10. Salas E, DiazGranados D, Klein C, et al. Does team training improve team performance? A meta-analysis. Hum Factors 2008;50:903–33.
11. Strasser DD, Burridge AB, Falconer JA, et al. Toward spanning the quality chasm: an examination of team functioning measures. Arch Phys Med Rehabil 2014;95:2220–3.
12. Roche VM, Torregosa H, Howell T, Malone ML. Establishing a treatment plan for an elder with a complex and incomplete medical history and multiple medical providers, diagnoses, and medications. Ann Long-Term Care 2012;20(9).
13. Roche VM, Arnouville J, Danto-Nocton ES, et al. Optimal management of an older patient with multiple comorbidities and a complex psychosocial history. Ann Long-Term Care 2011;19(9).
14. Wisconsin Geriatric Psychiatry Initiative. The Wisconsin Star Method. Accessed 19 Jan 2015 at wgpi.wisc.edu/wisconsin-star-method/.
Longer-Term Evidence Supporting Bariatric Surgery in Adolescents
Study Overview
Objective. To examine the efficacy and safety of weight-loss surgery in adolescents.
Design. Prospective observational study.
Setting and participants. Adolescents (aged 13–19 years) with severe obesity undergoing bariatric surgery at 5 U.S. hospitals and medical centers from March 2007 through February 2012. Participants were enrolled in the Teen-Longitudinal Assessment of Bariatric Surgery (Teen-LABS) study, a longitudinal prospective study that investigated the risks and benefits of adolescent bariatric surgery.
Main outcome measures. Data were collected on weight, comorbidities, cardiometabolic risk factors, nutritional status, and weight-related quality of life at research visits scheduled at 6 months, 1 year, 2 years, and 3 years after bariatric surgery. Researchers measured height, weight, and blood pressure directly and calculated BMI. They assessed comorbidities and cardiometabolic risk factors through urine and serum laboratory tests of lipids, glomerular filtration rate, albumin, glycated hemoglobin, fasting glucose, and insulin. They assessed nutritional status with laboratory values for serum albumin, folate, vitamin B12, 25-hydroxyvitamin D, parathyroid hormone, ferritin, transferrin, vitamin A, and vitamin B1 (erythrocyte transketolase). Researchers interviewed participants to collect information about subsequent medical or surgical procedures or, if participants missed a research visit, obtained the information through chart review. Finally, weight-related quality of life was assessed with the Impact of Weight on Quality of Life-Kids instrument, a validated self-report measure with 27 items divided into 4 subscales: physical comfort, body esteem, social life, and family relations.
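BMI here refers to the standard formula, weight in kilograms divided by the square of height in meters. The study does not publish its calculation code, so the sketch below is a minimal illustration with made-up values rather than study data.

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2

# Illustrative values only, not study data: a 150 kg adolescent who is 1.70 m tall.
print(round(bmi(150.0, 1.70), 1))  # 51.9, well within the severe-obesity range
```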
Main results. Analysis was conducted on results for 228 of 242 participants, those who underwent Roux-en-Y gastric bypass (n = 161) or sleeve gastrectomy (n = 67). Results for the 14 participants who received adjustable gastric banding were excluded because of the small size of that group. Mean weight loss was 41 kg, while mean height increased by only 0.51 cm. The mean percentage of weight loss was 27% overall and was similar in both groups: 28% in participants who underwent gastric bypass and 26% in those who underwent sleeve gastrectomy. At the 3-year visit, there were statistically significant improvements in comorbidities: 74% of the 96 participants with elevated blood pressure, 66% of the 171 participants with dyslipidemia, and 86% of the 36 participants with abnormal kidney function at baseline had values within the normal range. None of the 3 participants with type 1 diabetes at baseline had resolution. However, 29 participants had type 2 diabetes (median glycated hemoglobin, 6.3% at baseline), and 19 of the 20 for whom data were available at 3 years were in remission, with a median glycated hemoglobin of 5.3%. There was an increase in the number of participants with micronutrient deficiencies at the 3-year mark: the percentage of participants with low ferritin levels increased from 5% at baseline to 57%, those with low vitamin B12 increased from < 1% to 8%, and those with low vitamin A increased from 6% to 16%. During the 3-year follow-up period, 30 participants underwent 44 intra-abdominal procedures related to the bariatric procedure, and 29 participants underwent 48 endoscopic procedures, including stricture dilatation (n = 11). Total scores on the Impact of Weight on Quality of Life-Kids instrument improved from a mean of 63 at baseline to 83 at 3 years.
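Percentage of weight loss is conventionally computed relative to baseline weight. The sketch below illustrates the arithmetic with made-up numbers consistent with the reported cohort means (a 41 kg loss producing roughly 27%); it is not a reconstruction of the study’s analysis.

```python
def percent_weight_loss(baseline_kg: float, followup_kg: float) -> float:
    """Percentage of baseline body weight lost at follow-up."""
    return (baseline_kg - followup_kg) / baseline_kg * 100

# Illustrative only: a 41 kg loss from a ~152 kg baseline is about 27%,
# in line with the cohort means reported above.
print(round(percent_weight_loss(152.0, 111.0), 1))  # 27.0
```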
Conclusion. Overall there were significant improvements in weight, comorbidities, cardiometabolic health, and weight-related quality of life. However, there were also risks, including increased micronutrient deficiencies and the need for subsequent invasive abdominal procedures.
Commentary
Pediatric obesity is one of the most significant health problems facing children and adolescents. According to the most recent estimates, 34.5% of all adolescents aged 12 to 19 years are overweight or obese [1]. Pediatric obesity has serious short- and long-term psychosocial and physical implications. Obese adolescents suffer from social marginalization, poor self-concept, and lower health-related quality of life [2,3]. They are at greater risk for metabolic syndrome, diabetes, obstructive sleep apnea, and conditions associated with coronary artery disease such as hyperlipidemia and hypertension [4,5]. Additionally, obesity in adolescence is strongly associated with early mortality and years of life lost [6].
Despite extensive research and public health campaigns, rates of adolescent obesity have not decreased since 2003 [1]. Diet and behavioral approaches have had limited success and are rarely sustained over time. Bariatric surgery is an approach that has been used safely and effectively in severely obese adults and is increasingly being used for adolescents as well [7]. The results of this study are encouraging in that they suggest that bariatric surgery is effective in adolescents, leading to significant and sustained weight loss over 3 years and improved cardiometabolic health and weight-related quality of life.
The procedures are not without risks, as demonstrated by the findings of micronutrient deficiencies and the need for follow-up intra-abdominal and endoscopic procedures. The number of follow-up procedures, and the fact that they continued into the third year, is concerning. More details about this finding, such as the characteristics of participants who required these procedures, would be helpful. Further research to determine risk factors associated with complications that require subsequent invasive procedures is important for developing selection criteria for candidates for bariatric surgery. Additionally, there was no information on the impact of the follow-up procedures on participants or on the conditions that precipitated them, nor on physical sequelae that can cause ongoing distress for patients, eg, chronic abdominal cramping and pain. The authors measured weight-related quality of life, but measuring overall quality of life post-procedure would have captured the impact of post-procedure dietary restrictions and any medical problems. Such data could be helpful in decision-making about the use of bariatric procedures in this population versus noninvasive approaches to management.
As the authors note, treating severe obesity in adolescence rather than waiting until adulthood may have significant implications for improved health in adulthood, particularly in preventing or reversing cardiovascular damage related to obesity-related cardiometabolic risk factors. However, it is not yet known whether the positive outcomes, beginning with weight loss, are sustained through adulthood. This 3-year longitudinal study was the first to examine these factors over an extended time period; however, considering the average life expectancy of an adolescent, it provides only a relatively short-term outlook. A longitudinal study that follows a cohort of adolescents from the time of the bariatric procedure into middle age or beyond is needed. Such a study would also provide needed information about the long-term consequences of repeated intra-abdominal procedures and about the persistence or resolution of micronutrient deficiencies and their effects on health.
The strengths of this study are its prospective longitudinal design and its high rate of cohort completion (99% of participants remained actively involved, completing 88% of follow-up visits). As the authors note, the lack of a control group of adolescents treated with diet and behavioral approaches prevents any definitive statement about the benefits and risks compared with nonsurgical approaches. However, previous research indicates that weight loss is neither as great nor as well sustained when nonsurgical approaches are used.
Applications for Clinical Practice
The use of bariatric surgery in adolescents is a promising approach to a major health problem that has proven resistant to concerted medical and public health efforts and the use of nonsurgical treatments. Ongoing longitudinal research is needed but the positive outcomes seen here—sustained significant weight loss, improvement in cardiometabolic risk factors and comorbidities, and improved weight-related quality of life—indicate that bariatric surgery is an effective treatment for adolescent obesity when diet and behavioral approaches have failed. However, the occurrence of post-procedure complications also highlights the need for caution. Clinicians must carefully weigh the risk-benefit ratio for each individual, taking into consideration the long-term implications of severe obesity, any potential for significant weight loss with diet and behavioral changes, and the positive outcomes of bariatric surgery demonstrated here.
—Karen Roush, PhD, RN
1. Ogden CL, Carroll MD, Kit BK, Flegal KM. Prevalence of childhood and adult obesity in the United States, 2011–2012. JAMA 2014;311:806–14.
2. Schwimmer JB, Burwinkle TM, Varni JW. Health-related quality of life of severely obese children and adolescents. JAMA 2003;289:1813–9.
3. Strauss RS, Pollack HA. Social marginalization of overweight children. Arch Pediatr Adolesc Med 2003;157:746–52.
4. Inge TH, Zeller MH, Lawson ML, Daniels SR. A critical appraisal of evidence supporting a bariatric surgical approach to weight management for adolescents. J Pediatr 2005;147:10–9.
5. Weiss R, Dziura J, Burgert TS, et al. Obesity and the metabolic syndrome in children and adolescents. N Engl J Med 2004;350:2362–74.
6. Fontaine KR, Redden DT, Wang C, et al. Years of life lost due to obesity. JAMA 2003;289:187–93.
7. Zwintscher NP, Azarow KS, Horton JD, et al. The increasing incidence of adolescent bariatric surgery. J Pediatr Surg 2013;48:2401–7.
Study Overview
Objective. To examine the efficacy and safety of weight-loss surgery in adolescents.
Design. Prospective observational study.
Setting and participants. Adolescents (aged 13–19 years) with severe obesity undergoing bariatric surgery at 5 U.S. hospitals and medical centers from March 2007 through February 2012. Participants were enrolled in the Teen-Longitudinal Assessment of Bariatric Surgery (Teen-LABS) study, a longitudinal prospective study that investigated the risks and benefits of adolescent bariatric surgery.
Main outcome measures. Data were collected on weight, comorbidities, cardiometabolic risk factors, nutritional status, and weight-related quality of life at research visits scheduled at 6 months, 1 year, 2 years, and 3 years after bariatric surgery. Researchers directly measured height, weight, and blood pressure and calculated BMI. They assessed for comorbidities and cardiometabolic risk factors through urine and serum laboratory tests of lipids, glomerular filtration rate, albumin, glycated hemoglobin, fasting glucose level, and insulin. They assessed nutritional status with laboratory values for serum albumin, folate, vitamin B12, 25-hydroxyvitamin D, parathyroid hormone, ferritin, transferrin, vitamin A, and vitamin B1 erythrocyte transketolase. Researchers interviewed participants to collect information about subsequent medical or surgical procedures; if a participant missed a research visit, they obtained the information through chart review. Finally, weight-related quality of life was assessed with the Impact of Weight on Quality of Life-Kids instrument, a validated self-report measure with 27 items divided into 4 subscales: physical comfort, body esteem, social life, and family relations.
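BMI here follows the standard definition, weight in kilograms divided by the square of height in meters. The short Python sketch below is purely illustrative, with hypothetical values; it is not the study's own code.

```python
# Standard BMI formula: kg / m^2. Illustrative only; the example values
# are hypothetical, not Teen-LABS participant data.
def bmi(weight_kg: float, height_m: float) -> float:
    return weight_kg / height_m ** 2

print(round(bmi(120.0, 1.70), 1))  # 41.5 for a hypothetical 120 kg, 1.70 m adolescent
```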
Main results. Analysis was conducted on results for 228 of 242 participants: those who received Roux-en-Y gastric bypass (n = 161) or sleeve gastrectomy (n = 67). Results for the 14 participants who received adjustable gastric banding were not included due to the small size of that group. Mean weight loss was 41 kg, while mean height increased by only 0.51 cm. The mean percentage of weight loss was 27% overall and was similar in both groups: 28% in participants who underwent gastric bypass and 26% in those who underwent sleeve gastrectomy. At the 3-year visit, there were statistically significant improvements in comorbidities: 74% of the 96 participants with elevated blood pressure, 66% of the 171 participants with dyslipidemia, and 86% of the 36 participants with abnormal kidney function at baseline had values within the normal range. None of the 3 participants with type 1 diabetes at baseline had resolution. However, 29 participants had type 2 diabetes (median glycated hemoglobin 6.3% at baseline), and 19 of the 20 for whom data were available at 3 years were in remission, with a median glycated hemoglobin of 5.3%. There was an increase in the proportion of participants with micronutrient deficiencies at the 3-year mark: the percentage of participants with low ferritin levels increased from 5% at baseline to 57%, those with low vitamin B12 increased from < 1% to 8%, and those with low vitamin A increased from 6% to 16%. During the 3-year follow-up period, 30 participants underwent 44 intraabdominal procedures related to the bariatric procedure and 29 participants underwent 48 endoscopic procedures, including stricture dilatation (n = 11). Total scores on the Impact of Weight on Quality of Life-Kids instrument improved from a mean of 63 at baseline to 83 at 3 years.
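To make the headline figures concrete, the sketch below reproduces the percentage arithmetic implied by the reported means (41 kg lost, 27% of baseline weight) and the type 2 diabetes remission proportion. The baseline weight is back-calculated here for illustration and was not reported in this form by the study.

```python
# Illustrative arithmetic only; no individual participant data are used.
def percent_weight_loss(baseline_kg: float, followup_kg: float) -> float:
    """Percent of baseline body weight lost at follow-up."""
    return 100 * (baseline_kg - followup_kg) / baseline_kg

# A baseline weight consistent with the reported means (41 kg lost = 27%):
implied_baseline = 41 / 0.27                       # ~152 kg
print(round(percent_weight_loss(implied_baseline, implied_baseline - 41)))  # 27

# Type 2 diabetes remission among the 20 participants with 3-year data:
print(f"{19 / 20:.0%}")                            # 95%
```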
Conclusion. Overall, there were significant improvements in weight, comorbidities, cardiometabolic health, and weight-related quality of life. However, there were also risks, including increased micronutrient deficiencies and the need for subsequent invasive abdominal procedures.
Commentary
Pediatric obesity is one of the most significant health problems facing children and adolescents. According to the most recent estimates, 34.5% of all adolescents aged 12 to 19 years are overweight or obese [1]. Pediatric obesity has serious short- and long-term psychosocial and physical implications. Obese adolescents suffer from social marginalization, poor self-concept, and lower health-related quality of life [2,3]. They are at greater risk for metabolic syndrome, diabetes, obstructive sleep apnea, and conditions associated with coronary artery disease such as hyperlipidemia and hypertension [4,5]. Additionally, obesity in adolescence is strongly associated with early mortality and years of life lost [6].
Despite extensive research and public health campaigns, rates of adolescent obesity have not decreased since 2003 [1]. Diet and behavioral approaches have had limited success and are rarely sustained over time. Bariatric surgery is an approach that has been used safely and effectively in severely obese adults and is increasingly being used for adolescents as well [7]. The results of this study are encouraging in that they suggest that bariatric surgery is effective in adolescents, leading to significant and sustained weight loss over 3 years and improved cardiometabolic health and weight-related quality of life.
The procedures are not without risks, as demonstrated by the findings of micronutrient deficiencies and the need for follow-up intraabdominal and endoscopic procedures. The number of follow-up procedures, and the fact that they continued into the third year, is concerning. More details about this finding, such as the characteristics of participants who required these procedures, would be helpful. Further research to determine risk factors associated with complications that require subsequent invasive procedures is important for developing selection criteria for candidates for bariatric surgery. Additionally, there was no information on the impact of the follow-up procedures on participants, on the conditions that precipitated them, or on physical sequelae that can cause ongoing distress for patients, eg, chronic abdominal cramping and pain. The authors measured weight-related quality of life, but measuring overall quality of life post-procedure would have captured the impact of post-procedure dietary restrictions and any medical problems. Such data could be helpful in decision-making about the use of bariatric procedures in this population versus noninvasive approaches to management.
As the authors note, treating severe obesity in adolescence rather than waiting until adulthood may have significant implications for improved health in adulthood, particularly in preventing or reversing cardiovascular damage related to obesity-related cardiometabolic risk factors. However, it is not yet known whether the positive outcomes, beginning with weight loss, are sustained through adulthood. This 3-year longitudinal study was the first to examine these factors over an extended time period; however, considering the average life expectancy of an adolescent, it provides only a relatively short-term outlook. A longitudinal study that follows a cohort of adolescents from the time of the bariatric procedure into middle age or beyond is needed. Such a study would also provide needed information about the long-term consequences of repeated intraabdominal procedures and about the persistence or resolution of micronutrient deficiencies and their effects on health.
The strengths of this study are its prospective longitudinal design and its high rate of cohort completion (99% of participants remained actively involved, completing 88% of follow-up visits). As the authors note, the lack of a control group of adolescents treated with diet and behavioral approaches prevents any definitive statement about the benefits and risks compared with nonsurgical approaches. However, previous research indicates that weight loss is neither as great nor as well sustained when nonsurgical approaches are used.
Applications for Clinical Practice
The use of bariatric surgery in adolescents is a promising approach to a major health problem that has proven resistant to concerted medical and public health efforts and to nonsurgical treatments. Ongoing longitudinal research is needed, but the positive outcomes seen here—sustained significant weight loss, improvement in cardiometabolic risk factors and comorbidities, and improved weight-related quality of life—indicate that bariatric surgery is an effective treatment for adolescent obesity when diet and behavioral approaches have failed. However, the occurrence of post-procedure complications also highlights the need for caution. Clinicians must carefully weigh the risk-benefit ratio for each individual, taking into consideration the long-term implications of severe obesity, any potential for significant weight loss with diet and behavioral changes, and the positive outcomes of bariatric surgery demonstrated here.
—Karen Roush, PhD, RN
1. Ogden CL, Carroll MD, Kit BK, Flegal KM. Prevalence of childhood and adult obesity in the United States, 2011–2012. JAMA 2014;311:806–14.
2. Schwimmer JB, Burwinkle TM, Varni JW. Health-related quality of life of severely obese children and adolescents. JAMA 2003;289:1813–9.
3. Strauss RS, Pollack HA. Social marginalization of overweight children. Arch Pediatr Adolesc Med 2003;157:746–52.
4. Inge TH, Zeller MH, Lawson ML, Daniels SR. A critical appraisal of evidence supporting a bariatric surgical approach to weight management for adolescents. J Pediatr 2005;147:10–9.
5. Weiss R, Dziura J, Burgert TS, et al. Obesity and the metabolic syndrome in children and adolescents. N Engl J Med 2004;350:2362–74.
6. Fontaine KR, Redden DT, Wang C, et al. Years of life lost due to obesity. JAMA 2003;289:187–93.
7. Zwintscher NP, Azarow KS, Horton JD, et al. The increasing incidence of adolescent bariatric surgery. J Pediatr Surg 2013;48:2401–7.
Fruits But Not Vegetables Associated with Lower Risk of Developing Hypertension
Study Overview
Objective. To examine the association of individual fruit and vegetable intake with the risk of developing hypertension.
Design. Meta-analysis.
Setting and participants. Subjects were derived from the Nurses’ Health Study (n = 121,700 women, aged 30–55 years in 1976), the Nurses’ Health Study II (n = 116,430 women, aged 25–42 years in 1989), and the Health Professionals Follow-up Study (n = 51,529 men, aged 40–75 years in 1986). Participants returned a questionnaire every 2 years reporting any diagnosis of hypertension by a health care provider. Participants also answered qualitative–quantitative food frequency questionnaires (FFQs) every 4 years, reporting their intake of > 130 foods and beverages. Participants who reported a diagnosis of hypertension on the baseline questionnaire were excluded from the analysis.
Main outcome measures. Self-reported incident hypertension.
Results. Compared to participants whose consumption of fruits and vegetables was ≤ 4 servings/week, those whose intake was ≥ 4 servings/day had multivariable pooled hazard ratios for incident hypertension of 0.92 (95% confidence interval [CI], 0.87–0.97) for total whole fruit intake and 0.95 (CI, 0.86–1.04) for total vegetable intake. Similarly, compared to participants who did not increase their fruit or vegetable consumption, the pooled hazard ratios for those whose intake increased by ≥ 7 servings/week were 0.94 (0.90–0.97) for total whole fruit intake and 0.98 (0.94–1.01) for total vegetable intake. When individual fruit and vegetable consumption was analyzed, consumption of ≥ 4 servings/week (as opposed to < 1 serving/month) of broccoli, carrots, tofu or soybeans, raisins, and apples was associated with lower hypertension risk. In contrast, string beans, Brussels sprouts, and cantaloupe were associated with an increased risk of hypertension.
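The significance pattern in these results can be read directly from the confidence intervals: a hazard ratio is conventionally called statistically significant at the 0.05 level when its 95% CI excludes 1.0. The small Python helper below, written for this review as an illustration, applies that rule to the two pooled estimates quoted above.

```python
# Reading hazard ratio confidence intervals: a 95% CI that excludes 1.0
# indicates statistical significance at the 0.05 level.
def ci_excludes_one(lower: float, upper: float) -> bool:
    return upper < 1.0 or lower > 1.0

print(ci_excludes_one(0.87, 0.97))  # whole fruit, HR 0.92: True (significant)
print(ci_excludes_one(0.86, 1.04))  # vegetables, HR 0.95: False (not significant)
```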
Conclusion. The study findings suggested that greater long-term intake and increased consumption of whole fruits may reduce the risk of developing hypertension.
Commentary
Hypertension is a major risk factor for cardiovascular disease and a growing public health concern. Effective public health interventions that will lead to population-wide reductions in blood pressure are needed. The adoption of a healthy diet and low sodium intake is recommended by the American Heart Association in order to prevent hypertension in adults [1]. However, specific information about the benefits of long-term intake and individual foods is limited.
This study aimed to examine the association of individual fruit and vegetable intake with the risk of developing hypertension in 3 large prospective cohort studies in the United States. The investigators found that greater long-term intake and increased consumption of whole fruits may reduce the risk of developing hypertension. Participants with higher fruit and vegetable intakes were older and more physically active, had higher daily caloric intakes, and were less likely to be smokers.
This study was novel in that it examined individual fruit and vegetable consumption. The 3 studies together provided a large sample, which increased the precision and power of the statistical analysis. The researchers focused on estimating the association between fruit and vegetable consumption and the risk of hypertension; accordingly, they presented hazard ratios and used Cox regression with multivariable adjustment, appropriate statistical methods for this type of study.
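For readers unfamiliar with this analytic approach, the sketch below shows what a Cox proportional hazards fit of this kind looks like in Python using the open-source lifelines package. The column names and data are hypothetical stand-ins; the original analysis involved many more covariates and cohort-specific pooling.

```python
# A minimal Cox proportional hazards sketch with the lifelines package.
# All data and column names are hypothetical; this is not the study's code.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "followup_years": [4, 8, 12, 6, 10, 12, 9, 3],   # years to diagnosis or censoring
    "hypertension":   [1, 0, 1, 1, 0, 1, 0, 1],      # 1 = incident hypertension
    "fruit_servings_per_day": [0.5, 4.0, 1.0, 3.0, 0.5, 4.5, 1.5, 0.5],
    "age_years": [45, 50, 38, 60, 41, 55, 47, 52],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_years", event_col="hypertension")
cph.print_summary()  # prints hazard ratios (exp(coef)) with 95% CIs
```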
Some limitations should be mentioned. Blood pressure was not directly measured. Food intake was measured using a dietary questionnaire and may not have accurately represented actual intake. Also, participants were mostly non-Hispanic white men and women, and other population groups were not well represented.
Applications for Clinical Practice
The potential for increased fruit consumption to reduce the risk of hypertension needs to be examined in other population groups and study designs. In the meantime, clinicians can continue to recommend an eating plan that is rich in fruits, vegetables, and low-fat dairy products and reduced in saturated fat, total fat, and cholesterol.
—Paloma Cesar de Sales, BS, RN, MS
1. American Heart Association. Prevention of high blood pressure. Available at www.heart.org/HEARTORG/Conditions/HighBloodPressure/PreventionTreatmentofHighBloodPressure/Shaking-the-Salt-Habit_UCM_303241_Article.jsp#.VsNZ8eZab-Y.
Slow and Steady May Not Win the Race for Weight Loss Maintenance
Study Overview
Objective. To compare weight regain after rapid versus slower loss of an equivalent amount of weight.
Study design. Randomized clinical trial.
Setting and participants. This study took place in a single medical center in the Netherlands. Investigators recruited 61 adults (no age range provided) with a body mass index (BMI) between 28 and 35 kg/m2 and a stable weight (no change of > 3 kg over the past 2 months) to participate in a weight loss study. Individuals with type 2 diabetes, dyslipidemia, uncontrolled hypertension, or liver, heart, or kidney disease were excluded, as were those who were currently pregnant or reported consuming more than moderate amounts of alcohol.
After consenting, participants were randomized to one of 2 study arms. The rapid weight loss arm was prescribed a very-low-calorie diet (VLCD) of just 500 kcal/day (43% protein/43% carb/14% fat) for 5 weeks, after which they transitioned to a 4-week “weight stable” period and then a 9-month follow-up period (overall follow-up time of ~11 months; 10 months after weight loss). In contrast, the slower weight loss arm was prescribed a low-calorie diet (LCD) of 1250 kcal/day (29% protein/48% carb/23% fat) for 12 weeks, after which they also transitioned to a 4-week weight stable period and 9 months of follow-up (overall follow-up time of ~13 months; 10 months after weight loss). VLCD (rapid weight loss) participants received 3 meal replacement shakes per day (totaling 500 kcal) during the weight loss period and were also told they could consume unlimited amounts of low-calorie vegetables. The LCD (slower weight loss) participants received 1 meal replacement shake per day during their 12 weeks of weight loss and were responsible for providing the remainder of their own meals and snacks according to guidelines from a study dietitian. Following active weight loss, both groups shifted to higher-calorie, food-based diets during the 4-week “weight stable” period and were responsible during this time for providing all of their own food. The researchers do not specify the diet composition for this weight stable period. Exposure to the registered dietitian was the same in both groups, with 5 consultations during weight loss (weekly for VLCD, presumably more spaced out for LCD) and 4 during the weight stable period. No further diet advice or meal replacement support was given during the 9-month follow-up period, but participants came in for monthly weigh-ins.
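To put the two prescriptions in more familiar units, the sketch below converts the quoted calorie and macronutrient percentages into approximate grams per day using the standard Atwater factors (4 kcal/g for protein and carbohydrate, 9 kcal/g for fat). These gram values are derived here for illustration and were not reported by the investigators.

```python
# Converting prescribed calories and macronutrient percentages into grams/day
# with standard Atwater factors. Derived for illustration; not reported data.
def macros_in_grams(total_kcal: float, pct_protein: float,
                    pct_carb: float, pct_fat: float) -> dict:
    return {
        "protein_g": total_kcal * pct_protein / 100 / 4,
        "carb_g":    total_kcal * pct_carb / 100 / 4,
        "fat_g":     total_kcal * pct_fat / 100 / 9,
    }

print(macros_in_grams(500, 43, 43, 14))    # VLCD arm: ~54 g protein, ~54 g carb, ~8 g fat
print(macros_in_grams(1250, 29, 48, 23))   # LCD arm: ~91 g protein, ~150 g carb, ~32 g fat
```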
Main outcome measure. The primary outcome measure was change in weight (ie, amount of weight regained) during the 9-month follow-up period, compared between groups using an independent samples t test. Additional biometric measures included change in waist circumference and changes in body composition. For the latter, the researchers used a “Bod Pod” to conduct air-displacement plethysmography and determine what percentage of an individual’s weight was fat mass (FM) versus lean mass/water (FFM [fat-free mass]). They then compared the amount of FFM lost between groups, again using the independent samples t test.
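The body-composition arithmetic here is simple: fat mass is the measured fat fraction times total weight, and FFM is the remainder. A minimal Python sketch follows, with invented values and scipy's ttest_ind standing in for the independent samples t test.

```python
# Fat-free mass from total weight and measured body fat fraction, plus an
# independent samples t test between arms. All values are invented examples.
import numpy as np
from scipy import stats

def fat_free_mass(weight_kg: float, body_fat_fraction: float) -> float:
    """FFM = total weight minus fat mass (fat fraction from the Bod Pod)."""
    return weight_kg * (1 - body_fat_fraction)

print(round(fat_free_mass(95.0, 0.40), 1))  # 57.0 kg FFM for a hypothetical subject

# Hypothetical FFM loss (kg) per participant in each arm:
ffm_lost_vlcd = np.array([1.4, 1.8, 1.6, 1.5, 1.7])
ffm_lost_lcd  = np.array([0.5, 0.7, 0.6, 0.4, 0.8])
t, p = stats.ttest_ind(ffm_lost_vlcd, ffm_lost_lcd)
print(f"t = {t:.2f}, p = {p:.4f}")
```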
The researchers also collected information on self-reported physical activity (questionnaire) and self-reported history of weight cycling (the number of times a participant had previously lost and regained at least 5 kg) prior to this study. These were not outcomes per se but were collected so that they could be examined as correlates of the biometric outcomes above, using Pearson and Spearman correlation coefficients.
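As an illustration of the correlation analyses just described, the toy example below computes Pearson and Spearman coefficients with scipy; the data are invented for demonstration only.

```python
# Pearson (linear) and Spearman (rank) correlations, as used to relate
# correlates such as FFM loss to weight regain. Invented demonstration data.
from scipy.stats import pearsonr, spearmanr

ffm_lost_kg = [0.2, 0.5, 0.9, 1.3, 1.8, 2.1]   # fat-free mass lost
regained_kg = [2.0, 3.1, 4.0, 4.8, 5.5, 6.3]   # weight regained in follow-up

r, p_r = pearsonr(ffm_lost_kg, regained_kg)
rho, p_rho = spearmanr(ffm_lost_kg, regained_kg)
print(f"Pearson r = {r:.2f} (p = {p_r:.3f}); Spearman rho = {rho:.2f} (p = {p_rho:.3f})")
```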
Results. The LCD (n = 29) and VLCD (n = 28) groups were similar at baseline with no significant differences reported. Of the 61 individuals initially enrolled, 57 (93%) completed the study. Summary statistics are reported only for these 57 individuals. No imputation or other methods for handling missing data were used. There were slightly more women than men in the study (53% women); the average (SD) age was 51.8 (1.9) years in the LCD group and 50.7 (1.5) years in the VLCD group. Mean starting BMI was 31 kg/m2 (31.3 [0.5] in LCD, 31.0 [0.4] in VLCD) and both groups had just under 40% body fat at baseline (39.9% [1.8] in LCD, 39.7% [1.5] in VLCD).
After 12 weeks of weight loss for LCD, or 5 weeks of weight loss for VLCD, both groups had lost a similar amount of total weight (8.2 [0.5] kg in LCD vs. 9.0 [0.4] kg in VLCD), then had no significant changes in weight during the subsequent 4-week “weight stable” period. However, during the weight stable period VLCD patients had an average 0.8 (0.6) cm increase in waist circumference (a rebound after a decrease of 7.7 cm during weight loss), while LCD patients on average had a continued decrease of 1.0 (0.5) cm in waist circumference (P = 0.003).
There was no significant difference between groups for the primary outcome of weight regain during 9 months of follow-up (4.2 [0.6] kg regained for LCD, 4.5 [0.7] kg for VLCD; P = 0.73). The only significant correlates of weight regain were the amount of FFM lost (more lean mass lost predicted more weight regain) and the amount of physical activity reported during follow-up (more activity predicted less regain). Participant sex, age, starting BMI, history of weight cycling, and amount of weight lost did not correlate with regain.
One area where there was a significant between-group difference, both after initial weight loss and persisting after the weight stable period, was in the amount of FFM lost (a rough approximation of lost lean mass, eg, muscle mass). VLCD participants had more FFM loss (1.6 [0.2] kg) than LCD participants (0.6 [0.2] kg) (P < 0.01) after active weight loss, and continued to have significantly more FFM loss (0.8 [0.2] kg vs. 0.2 [0.2] kg) after the 4-week weight stable period.
There were no between-group differences at the end of weight loss or at the end of follow-up for hip or waist circumference or for blood pressure.
Conclusion. The authors conclude that, once a similar amount of weight has been lost, the rate of weight loss does not affect the risk of weight regain after a diet.
Commentary
The failure of most diets to produce durable weight loss is a frustration for patients, clinicians, and researchers. In general, regardless of the composition of a diet, the majority of patients will regain some or all of their lost weight within several years after completing the diet. The reasons for weight regain are complex, and include reversion to old eating or physical activity behaviors but also a strong physiologic drive by the body to reverse weight loss that it perceives as a threat to health [1].
One area in diet research that has recently generated some controversy is whether the rate of initial weight loss affects a patient’s ability to maintain that weight loss, with conventional wisdom (and, in some cases, national guidelines) suggesting that slower weight loss is preferable to rapid weight loss for this reason [2]. A handful of studies have challenged this notion, however, suggesting that rapid weight loss does not necessarily lead to greater weight regain [3,4]. Previous such studies, however, have generally not been designed to compare regain after equal amounts of weight loss, which may make their results more difficult to interpret.
The present study contributes another piece of evidence to the argument that rapid initial weight loss may not increase a patient’s risk of regain. This small randomized trial is timely and has several features that make it a unique contribution. First, the design of the study allowed for both groups, despite losing weight at very different rates, to reach the same amount of total weight loss before being followed forward in time. This made weight regain much easier to compare between groups during follow-up. Second, the study included measurement of changing body composition—ie, what kind of weight was being lost (fat vs. fat-free mass)—rather than just the total amount of weight. This allowed the researchers to present data for an outcome that is mechanistically related to metabolic rate (and therefore weight regain), and one that might have implications for longer-term health after rapid versus more moderate-pace weight loss.
Several aspects of the study design, however, may limit the impact of the findings. For example, while a certain type of diet was “prescribed” in each arm, there is no comment about assessment of participant fidelity to the prescribed diet, and there is potential for very different levels of adherence between groups, especially during active weight loss, when essentially all meals were provided to the VLCD arm but LCD subjects were responsible for about 90% of their own meals. This could have led to larger discrepancies between the prescribed and actual diet in the LCD arm relative to VLCD. Granted, the rate of weight loss was the exposure of interest, and that clearly varied between groups as expected, implying at least moderate fidelity to the prescribed caloric content of each diet, but how much protein vs. fat vs. carbohydrate was actually consumed by each group is not clear. Additionally, while 9 months of post-weight-loss follow-up is certainly a good start in terms of follow-up duration, it may not have been sufficient to observe differences in weight regain that would emerge between the groups later. Other long-term weight loss maintenance studies have followed patients for several years or longer after initial weight loss [5].
Using data from all participants, the researchers reported that the amount of FFM an individual lost was a predictor of weight regain during follow-up. This finding is in keeping with the idea that more lean mass loss leads to a lower metabolic rate and predisposes to weight regain (hence the conventional wisdom to avoid rapid weight loss with low-protein diets). In keeping with this theme, VLCD patients, whose protein intakes and activity levels were lower, did lose more FFM (ie, lean mass) than LCD patients. It was therefore surprising that in between-group analyses there was no statistical difference in weight regain. On some level, this raises concerns about the robustness of the overall finding. Perhaps with a larger sample, more precise measures of FFM lost (eg, with DEXA scanning instead of the Bod Pod), or longer follow-up, this difference in lost lean mass between groups would have predicted greater weight regain for VLCD patients. The researchers attribute some of the FFM loss after the caloric restriction phase to decreased water and glycogen stores, rather than muscle mass, and speculate that this is why no between-group impact on weight regain was seen.
From a generalizability standpoint, there are important safety concerns with the use of VLCDs, aside from subsequent risk of weight regain, that are not addressed with this study. Many patients simply cannot tolerate a 500 kcal per day diet, including those with more severe obesity (who have higher daily energy requirements) or those with complicated chronic medical conditions who might be at higher risk of complications from such low energy intake. Accordingly, these kinds of patients were not included in this study, so it is not clear whether results might generalize to them.
Applications for Clinical Practice
Despite the conventional wisdom that slower weight loss may be more sustainable over time, several recent trials have suggested otherwise. Nonetheless, rapid weight loss produced with the use of VLCDs is not appropriate for every patient and must be carefully overseen by a weight management professional. Furthermore, rapid weight loss may place patients at increased risk of preferentially losing lean mass, which does correlate with the risk of weight regain and could set them up for other health problems in the long term. More studies are needed in this area before a definitive judgment can be made regarding the long-term risks and benefits of rapid versus moderate-pace weight loss.
—Kristina Lewis, MD, MPH
1. Anastasiou CA, Karfopoulou E, Yannakoulia M. Weight regaining: From statistics and behaviors to physiology and metabolism. Metabolism 2015;64:1395–407.
2. Casazza K, Brown A, Astrup A, et al. Weighing the evidence of common beliefs in obesity research. Crit Rev Food Sci Nutr 2015;55:2014–53.
3. Purcell K, Sumithran P, Prendergast LA, et al. The effect of rate of weight loss on long-term weight management: a randomised controlled trial. Lancet Diabetes Endocrinol 2014;2:954–62.
4. Toubro S, Astrup A. Randomised comparison of diets for maintaining obese subjects’ weight after major weight loss: ad lib, low fat, high carbohydrate diet v fixed energy intake. BMJ 1997;314:29–34.
5. Wing RR, Phelan S. Long-term weight loss maintenance. Am J Clin Nutr 2005;82(1 Suppl):222S–225S.
Study Overview
Objective. To compare weight regain after rapid versus slower loss of an equivalent amount of weight.
Study design. Randomized clinical trial.
Setting and participants. This study took place in a single medical center in the Netherlands. Investigators recruited 61 adults (no age range provided) with body mass index (BMI) between 28–35 kg/m2 and at a stable weight (no change of > 3 kg for the past 2 months) to participate in a weight loss study. Individuals with type 2 diabetes, dyslipidemia, uncontrolled hypertension, or liver, heart or kidney disease were excluded, as were those who were currently pregnant or reported consuming more than moderate amounts of alcohol.
Once consented, participants were randomized into one of 2 study arms. The rapid weight loss arm was prescribed a very-low-calorie diet (VLCD) with just 500 kcal/day (43% protein/43% carb/14% fat) for 5 weeks, after which they transitioned to a 4-week “weight stable” period, and then a 9-month follow-up period (overall follow-up time of ~11 months; 10 months after weight loss). In contrast, the slower weight loss arm was prescribed a low-calorie diet (LCD) with 1250 kcal/day (29% protein/48% carb/23% fat) for 12 weeks, after which they also transitioned to a 4-week weight stable period and 9 months of follow-up (overall follow-up time of ~13 months; 10 months after weight loss). VLCD (rapid weight loss) participants received 3 meal replacement shakes per day (totaling 500 kcal) during the weight loss period and were also told they could consume unlimited amounts of low-calorie vegetables. The LCD (slower weight loss) participants received 1 meal replacement shake per day during their 12 weeks of weight loss and were responsible for providing the remainder of their own meals and snacks according to guidelines from a study dietitian. Following active weight loss, both groups then shifted to higher-calorie, food-based diets during a “weight stable” 4-week period and were responsible during this time for providing all of their own food. The researchers do not specify the details of the diet composition for this weight stable period. Exposure to the registered dietitian was the same in both groups, with 5 consultations during weight loss (weekly for VLCD, presumably more spaced out for LCD) and 4 during weight stable period. No further diet advice or meal replacement support was given during the 9-month follow-up period, but participants came in for monthly weigh-ins.
Main outcome measure. The primary outcome measure was change in weight (ie, amount of weight regained) during the 9-month follow-up period, compared between groups using an independent samples t test. Additional biometric measures included change in waist circumference and changes in body composition. For the latter, the researchers used a “Bod Pod” to conduct air-displacement plethysmography and determine what percentage of an individual’s weight was fat mass (FM) versus lean mass/water (FFM [fat-free mass]). They then compared the amount of FFM lost between groups, again using the independent samples t test.
The researchers also collected information on self-reported physical activity (questionnaire) and self-reported history of weight cycling (number of times a participant had previously lost and regained at least 5 kg) prior to this study. These were not outcomes per-se, but were collected so that they could be examined as correlates of the biometric outcomes above, using Pearson and Spearman’s correlation coefficients.
Results. The LCD (n = 29) and VLCD (n = 28) groups were similar at baseline with no significant differences reported. Of the 61 individuals initially enrolled, 57 (93%) completed the study. Summary statistics are reported only for these 57 individuals. No imputation or other methods for handling missing data were used. There were slightly more women than men in the study (53% women); the average (SD) age was 51.8 (1.9) years in the LCD group and 50.7 (1.5) years in the VLCD group. Mean starting BMI was 31 kg/m2 (31.3 [0.5] in LCD, 31.0 [0.4] in VLCD) and both groups had just under 40% body fat at baseline (39.9% [1.8] in LCD, 39.7% [1.5] in VLCD).
After 12 weeks of weight loss for LCD, or 5 weeks of weight loss for VLCD, both groups lost a similar amount of total weight (8.2 [0.5] kg in LCD vs. 9.0 [0.4] kg in VLCD), then had no significant changes in weight during the subsequent 4-week “weight stable” period. However, during the weight stable period VLCD patients had an average 0.8 (0.6) cm increase in waist circumference (a rebounding after a decrease of 7.7 cm during weight loss), while LCD patients on average had a continued decrease of 1.0 (0.5 cm) in waist circumference (P = 0.003).
There was no significant difference between groups for the primary outcome of weight regain during 9-months of follow-up (4.2 [0.6] kg regained for LCD, 4.5 [0.7] for VLCD; P = 0.73). The only significant correlates of weight regain were amount of FFM lost (more lean mass lost predicted more weight regain), and amount of physical activity reported during follow-up (more activity predicted less regain). Participant sex, age, starting BMI, history of weight cycling, and amount of weight lost did not correlate with rate of re-gain.
One area where there was a significant between-group difference, both after initial weight loss and persisting after the weight stable period, was in the amount of FFM lost (a rough approximation of lost lean mass, eg, muscle mass). VLCD participants had more FFM loss (1.6 [0.2] kg) than LCD participants (0.6 [0.2] kg) (P < 0.01) after active weight loss, and continued to have significantly more FFM loss (0.8 [0.2] kg vs. 0.2 [0.2] kg) after the 4-week weight stable period.
There were no between-group differences at the end of weight loss or at the end of follow-up for hip or waist circumference or for blood pressure.
Conclusion. The authors conclude that rate of weight loss does not affect one’s risk of weight regain after a diet, after a similar amount of weight has been lost.
Commentary
The failure of most diets to produce durable weight loss is a frustration for patients, clinicians, and researchers. In general, regardless of the composition of a diet, the majority of patients will regain some or all of their lost weight within several years after completing the diet. The reasons for weight regain are complex, and include reversion to old eating or physical activity behaviors but also a strong physiologic drive by the body to reverse weight loss that it perceives as a threat to health [1].
Study Overview
Objective. To compare weight regain after rapid versus slower loss of an equivalent amount of weight.
Study design. Randomized clinical trial.
Setting and participants. This study took place in a single medical center in the Netherlands. Investigators recruited 61 adults (no age range provided) with body mass index (BMI) between 28–35 kg/m2 and at a stable weight (no change of > 3 kg for the past 2 months) to participate in a weight loss study. Individuals with type 2 diabetes, dyslipidemia, uncontrolled hypertension, or liver, heart or kidney disease were excluded, as were those who were currently pregnant or reported consuming more than moderate amounts of alcohol.
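For concreteness, the two quantitative inclusion criteria can be expressed as a simple screen. The sketch below is purely illustrative (the function names are ours, not the investigators’), assuming only the BMI range and weight-stability threshold reported above:

def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index in kg/m^2."""
    return weight_kg / height_m ** 2

def quantitatively_eligible(weight_kg: float, height_m: float,
                            weight_change_2mo_kg: float) -> bool:
    """Apply the two numeric inclusion criteria reported: BMI between
    28 and 35 kg/m^2 and a stable weight (no change of more than 3 kg
    over the past 2 months). Clinical exclusions (diabetes, pregnancy,
    organ disease, alcohol use) would be screened separately."""
    return 28 <= bmi(weight_kg, height_m) <= 35 and abs(weight_change_2mo_kg) <= 3

# Example: 90 kg at 1.70 m gives a BMI of ~31.1; with a stable weight,
# this hypothetical individual would pass the quantitative screen.
print(quantitatively_eligible(90.0, 1.70, 1.5))  # True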
Once consented, participants were randomized into one of 2 study arms. The rapid weight loss arm was prescribed a very-low-calorie diet (VLCD) with just 500 kcal/day (43% protein/43% carb/14% fat) for 5 weeks, after which they transitioned to a 4-week “weight stable” period, and then a 9-month follow-up period (overall follow-up time of ~11 months; 10 months after weight loss). In contrast, the slower weight loss arm was prescribed a low-calorie diet (LCD) with 1250 kcal/day (29% protein/48% carb/23% fat) for 12 weeks, after which they also transitioned to a 4-week weight stable period and 9 months of follow-up (overall follow-up time of ~13 months; 10 months after weight loss). VLCD (rapid weight loss) participants received 3 meal replacement shakes per day (totaling 500 kcal) during the weight loss period and were also told they could consume unlimited amounts of low-calorie vegetables. The LCD (slower weight loss) participants received 1 meal replacement shake per day during their 12 weeks of weight loss and were responsible for providing the remainder of their own meals and snacks according to guidelines from a study dietitian. Following active weight loss, both groups then shifted to higher-calorie, food-based diets during a “weight stable” 4-week period and were responsible during this time for providing all of their own food. The researchers do not specify the details of the diet composition for this weight stable period. Exposure to the registered dietitian was the same in both groups, with 5 consultations during weight loss (weekly for VLCD, presumably more spaced out for LCD) and 4 during weight stable period. No further diet advice or meal replacement support was given during the 9-month follow-up period, but participants came in for monthly weigh-ins.
Main outcome measure. The primary outcome measure was change in weight (ie, amount of weight regained) during the 9-month follow-up period, compared between groups using an independent samples t test. Additional biometric measures included change in waist circumference and changes in body composition. For the latter, the researchers used a “Bod Pod” to conduct air-displacement plethysmography and determine what percentage of an individual’s weight was fat mass (FM) versus lean mass/water (FFM [fat-free mass]). They then compared the amount of FFM lost between groups, again using the independent samples t test.
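To make the primary analysis concrete, the sketch below runs an independent samples t test on simulated weight-regain data. The group means match the reported values, but the spreads and the data themselves are assumptions, since individual-level trial data are not available:

import numpy as np
from scipy import stats

# Simulated weight-regain values (kg), for illustration only; group means
# match the reported 4.2 kg (LCD, n = 29) and 4.5 kg (VLCD, n = 28), and
# the standard deviations are assumptions.
rng = np.random.default_rng(0)
lcd_regain = rng.normal(4.2, 3.0, 29)
vlcd_regain = rng.normal(4.5, 3.5, 28)

t_stat, p_value = stats.ttest_ind(lcd_regain, vlcd_regain)
print(f"t = {t_stat:.2f}, P = {p_value:.2f}")
# A large P (the trial reported P = 0.73) means no detectable difference.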
The researchers also collected information on self-reported physical activity (questionnaire) and self-reported history of weight cycling (number of times a participant had previously lost and regained at least 5 kg) prior to this study. These were not outcomes per se, but were collected so that they could be examined as correlates of the biometric outcomes above, using Pearson and Spearman correlation coefficients.
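A similar sketch illustrates the correlation analyses: Pearson’s r for continuous measures and Spearman’s rho for ordinal ones such as a count of prior weight-cycling episodes. Again, the data are simulated for illustration only:

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Simulated participant-level data (n = 57), for illustration only.
ffm_lost = rng.normal(1.0, 0.5, 57)                   # kg of fat-free mass lost
regain = 2.0 + 1.5 * ffm_lost + rng.normal(0, 1, 57)  # regain rises with FFM lost
cycles = rng.integers(0, 6, 57)                       # prior >=5 kg loss/regain episodes

r, p_r = stats.pearsonr(ffm_lost, regain)             # continuous vs continuous
rho, p_rho = stats.spearmanr(cycles, regain)          # ordinal count vs continuous
print(f"Pearson r = {r:.2f} (P = {p_r:.3f}); Spearman rho = {rho:.2f} (P = {p_rho:.3f})")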
Results. The LCD (n = 29) and VLCD (n = 28) groups were similar at baseline with no significant differences reported. Of the 61 individuals initially enrolled, 57 (93%) completed the study. Summary statistics are reported only for these 57 individuals. No imputation or other methods for handling missing data were used. There were slightly more women than men in the study (53% women); the average (SD) age was 51.8 (1.9) years in the LCD group and 50.7 (1.5) years in the VLCD group. Mean starting BMI was 31 kg/m2 (31.3 [0.5] in LCD, 31.0 [0.4] in VLCD) and both groups had just under 40% body fat at baseline (39.9% [1.8] in LCD, 39.7% [1.5] in VLCD).
After 12 weeks of weight loss for LCD, or 5 weeks of weight loss for VLCD, both groups had lost a similar amount of total weight (8.2 [0.5] kg in LCD vs. 9.0 [0.4] kg in VLCD), then had no significant changes in weight during the subsequent 4-week “weight stable” period. However, during the weight stable period VLCD patients had an average 0.8 (0.6) cm increase in waist circumference (a rebound after a decrease of 7.7 cm during weight loss), while LCD patients on average had a continued decrease of 1.0 (0.5) cm in waist circumference (P = 0.003).
There was no significant difference between groups for the primary outcome of weight regain during 9 months of follow-up (4.2 [0.6] kg regained for LCD vs. 4.5 [0.7] kg for VLCD; P = 0.73). The only significant correlates of weight regain were the amount of FFM lost (more lean mass lost predicted more weight regain) and the amount of physical activity reported during follow-up (more activity predicted less regain). Participant sex, age, starting BMI, history of weight cycling, and amount of weight lost did not correlate with regain.
One area where there was a significant between-group difference, both after initial weight loss and persisting after the weight stable period, was in the amount of FFM lost (a rough approximation of lost lean mass, eg, muscle mass). VLCD participants had more FFM loss (1.6 [0.2] kg) than LCD participants (0.6 [0.2] kg) (P < 0.01) after active weight loss, and continued to have significantly more FFM loss (0.8 [0.2] kg vs. 0.2 [0.2] kg) after the 4-week weight stable period.
There were no between-group differences at the end of weight loss or at the end of follow-up for hip or waist circumference or for blood pressure.
Conclusion. The authors conclude that rate of weight loss does not affect one’s risk of weight regain after a diet, after a similar amount of weight has been lost.
Commentary
The failure of most diets to produce durable weight loss is a frustration for patients, clinicians, and researchers. In general, regardless of a diet’s composition, the majority of patients will regain some or all of their lost weight within several years of completing it. The reasons for weight regain are complex and include not only reversion to old eating and physical activity behaviors but also a strong physiologic drive by the body to reverse weight loss that it perceives as a threat to health [1].
One area in diet research that has recently generated controversy is whether the rate of initial weight loss affects a patient’s ability to maintain that weight loss; conventional wisdom (and, in some cases, national guidelines) holds that slower weight loss is preferable to rapid weight loss for this reason [2]. A handful of studies have challenged this notion, however, suggesting that rapid weight loss does not necessarily lead to greater weight regain [3,4]. These studies, though, were generally not designed to compare regain after equal amounts of weight loss, which makes their results harder to interpret.
The present study contributes another piece of evidence to the argument that rapid initial weight loss may not increase a patient’s risk of regain. This small randomized trial is timely and has several features that make it a unique contribution. First, the design of the study allowed for both groups, despite losing weight at very different rates, to reach the same amount of total weight loss before being followed forward in time. This made weight regain much easier to compare between groups during follow-up. Second, the study included measurement of changing body composition—ie, what kind of weight was being lost (fat vs. fat-free mass)—rather than just the total amount of weight. This allowed the researchers to present data for an outcome that is mechanistically related to metabolic rate (and therefore weight regain), and one that might have implications for longer-term health after rapid versus more moderate-pace weight loss.
Several aspects of the study design, however, may limit the impact of the findings. For example, although a particular diet was “prescribed” in each arm, the authors do not report assessing participant fidelity to the prescribed diet, and adherence could have differed considerably between groups, especially during active weight loss, when essentially all meals were provided to the VLCD arm while LCD subjects were responsible for about 90% of their own meals. This could have led to larger discrepancies between prescribed and actual diet in the LCD arm relative to VLCD. Granted, the rate of weight loss was the exposure of interest, and that clearly varied between groups as expected, implying at least moderate fidelity to the prescribed caloric content of each diet, but how much protein, fat, and carbohydrate was actually consumed by each group is not clear. Additionally, while 9 months of post-weight-loss follow-up is certainly a good start, it may not have been long enough to observe between-group differences in weight regain that would emerge later. Other long-term weight loss maintenance studies have followed patients for several years or longer after initial weight loss [5].
Using data from all participants, the researchers reported that the amount of FFM an individual lost was a predictor of weight regain during follow-up. This finding is in keeping with the idea that greater lean mass loss lowers metabolic rate and predisposes to weight regain (hence the conventional wisdom to avoid rapid weight loss with low-protein diets). In keeping with this theme, VLCD patients, whose protein intakes and activity levels were lower, did lose more FFM (ie, lean mass) than LCD patients. It was therefore surprising that between-group analyses showed no statistical difference in weight regain. On some level, this raises concerns about the robustness of the overall finding. Perhaps with a larger sample, more precise measures of FFM lost (eg, with DEXA scanning instead of the “Bod Pod”), or longer follow-up, the difference in lost lean mass between groups would have translated into greater weight regain for VLCD patients. The researchers attribute some of the FFM loss after the caloric restriction phase to decreased water and glycogen stores, rather than muscle mass, and speculate that this is why no between-group impact on weight regain was seen.
From a generalizability standpoint, there are important safety concerns with the use of VLCDs, aside from subsequent risk of weight regain, that are not addressed with this study. Many patients simply cannot tolerate a 500 kcal per day diet, including those with more severe obesity (who have higher daily energy requirements) or those with complicated chronic medical conditions who might be at higher risk of complications from such low energy intake. Accordingly, these kinds of patients were not included in this study, so it is not clear whether results might generalize to them.
Applications for Clinical Practice
Despite the conventional wisdom that slower weight loss may be more sustainable over time, several recent trials have suggested otherwise. Nonetheless, rapid weight loss produced with the use of VLCDs is not appropriate for every patient and must be carefully overseen by a weight management professional. Furthermore, rapid weight loss may place patients at increased risk of preferentially losing lean mass, which does correlate with risk of weight regain and could set them up for other health problems in the long term. More studies are needed in this area before a definitive judgment can be made regarding the long-term risks and benefits of rapid versus moderate-pace weight loss.
—Kristina Lewis, MD, MPH
1. Anastasiou CA, Karfopoulou E, Yannakoulia M. Weight regaining: From statistics and behaviors to physiology and metabolism. Metabolism 2015;64:1395–407.
2. Casazza K, Brown A, Astrup A, et al. Weighing the evidence of common beliefs in obesity research. Crit Rev Food Sci Nutr 2015;55:2014–53.
3. Purcell K, Sumithran P, Prendergast LA, et al. The effect of rate of weight loss on long-term weight management: a randomised controlled trial. Lancet Diabetes Endocrinol 2014;2:954–62.
4. Toubro S, Astrup A. Randomised comparison of diets for maintaining obese subjects’ weight after major weight loss: ad lib, low fat, high carbohydrate diet v fixed energy intake. BMJ 1997;314:29–34.
5. Wing RR, Phelan S. Long-term weight loss maintenance. Am J Clin Nutr 2005;82(1 Suppl):222S–225S.
Delayed Prescriptions for Reducing Antibiotic Use
Study Overview
Objective. To determine the efficacy and safety of delayed antibiotic prescribing strategies in acute uncomplicated respiratory infections.
Design. Randomized, multicenter, open-label clinical trial.
Setting and participants. The setting was 23 primary care centers in Spain. The study recruited patients 18 years of age or older with an acute uncomplicated respiratory infection (acute pharyngitis, rhinosinusitis, acute bronchitis, or exacerbation of chronic bronchitis or mild-to-moderate chronic obstructive pulmonary disease). Physicians enrolled patients with these infections only when they were uncertain whether antibiotics were indicated. The study protocol has been published elsewhere [1].
Intervention. Patients were randomized to 1 of 4 potential prescription strategies: (1) a delayed patient-led prescription strategy where patients were given an antibiotic prescription at first consultation but instructed to fill the prescription only if they felt substantially worse or saw no improvement in symptoms in the first few days after initial consultation; (2) a delayed prescription collection strategy requiring patients to collect their prescription from the primary care center reception desk 3 days after the first consultation; (3) an immediate prescription strategy; or (4) no antibiotic strategy. The patient-led and delayed collection strategies were considered delayed prescription strategies.
Main outcome measures. Duration and severity of symptoms. Patients filled out a daily questionnaire for a maximum of 30 days, which listed common symptoms such as fever, discomfort or general pain, cough, difficulty sleeping, and changes in everyday life, as well as specific symptoms according to condition. Patients rated the severity of their symptoms on a 6-point Likert scale, with scores of 1–2 considered mild, 3–4 moderate, and 5–6 severe. Secondary outcomes included antibiotic use, patient satisfaction, patients’ beliefs in the effectiveness of antibiotics, and absenteeism (absence from work or from daily activities).
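As a small illustration of the severity banding, the helper below (a hypothetical function, not part of the study) maps a daily Likert score to the reported categories:

def severity_category(score: int) -> str:
    """Map a 6-point Likert score to the bands reported in the trial:
    1-2 mild, 3-4 moderate, 5-6 severe. (Function name is illustrative.)"""
    if not 1 <= score <= 6:
        raise ValueError("Likert score must be between 1 and 6")
    if score <= 2:
        return "mild"
    if score <= 4:
        return "moderate"
    return "severe"

# Example: a diary entry scored 5 would be counted as a severe-symptom day.
print(severity_category(5))  # severe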
Main results. A total of 405 patients were recruited, 398 of whom were included in the analysis. Of these, 136 (34.2%) were men; the mean (SD) age was 45 (17) years; and 265 patients (72%) had at least a secondary education. The most common infection was pharyngitis (n = 184; 46.2%), followed by acute bronchitis (n = 128; 32.2%). The mean severity of symptoms ranged from 1.8 to 3.5 points on the Likert scale, and the mean (SD) duration of symptoms reported at the first visit was 6 (6) days. The mean (SD) general health status at the first visit was 54 (20) on a scale from 0 (worst) to 100 (best). In all, 314 patients (80.1%) were nonsmokers, and 372 patients (93.5%) had no respiratory comorbidity. The presence of symptoms at the first visit was similar among the 4 groups.
The duration of the common symptoms of fever, discomfort or general pain, and cough was shorter in the immediate prescription group versus the no prescription group (P < 0.05 for all). In the immediate prescription group, the duration of patient symptoms after first visit was significantly different from that of the prescription collection and patient-led prescription groups only for discomfort or general pain. The mean (SD) duration of severe symptoms was 3.6 (3.3) days for the immediate prescription group, 4.0 (4.2) days for the prescription collection group, 5.1 (6.3) days for the patient-led prescription group, and 4.7 (3.6) days for the no prescription group. The median (interquartile range [IQR]) of severe symptoms was 3 (1–4) days for the prescription collection group and 3 (2–6) days for the patient-led prescription group. The median (IQR) of the maximum severity for any symptom was 5 (3–5) for the immediate prescription group and the prescription collection group; 5 (4–5) for the patient-led prescription group; and 5 (4–6) for the no prescription group. Patients randomized to the no prescription strategy or to either of the delayed strategies used fewer antibiotics and less frequently believed in antibiotic effectiveness. Among patients in the immediate prescription group, 91.1% used antibiotics; in the delayed patient-led, delayed collection, and no prescription groups, the rates of antibiotic use were 32.6%, 23.0%, and 12.1%, respectively. There were very few adverse events across groups, although the no prescription group had 3 adverse events compared with 0-1 in the other groups. Satisfaction was similar across groups.
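For readers less familiar with the median (IQR) summaries above, they are computed directly from percentiles; a brief sketch with made-up diary data:

import numpy as np

# Hypothetical days-of-severe-symptoms for one arm (illustration only,
# not the trial's data).
days = np.array([1, 1, 2, 3, 3, 3, 4, 4, 6, 8])
q1, median, q3 = np.percentile(days, [25, 50, 75])
print(f"median (IQR): {median:.0f} ({q1:.0f}-{q3:.0f})")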
Conclusion. Delayed strategies were associated with slightly greater but clinically similar symptom burden and duration and also with substantially reduced antibiotic use when compared with an immediate strategy.
Commentary
Acute respiratory infections are a common reason for physician visits. These infections tend to be self-limiting, and overuse of antibiotics for them is widespread. Approximately 60% of patients with a sore throat and ~70% of patients with acute uncomplicated bronchitis receive antibiotic prescriptions despite literature suggesting no or limited benefit [2,3]. Antibiotic resistance is a growing problem, and its main cause is misuse of antibiotics.
Physicians often feel pressured into prescribing antibiotics by patient expectations and patient satisfaction metrics. In the face of the critical need to reduce overuse, delayed antibiotic prescribing strategies offer a compromise between immediate prescription and no prescription [4]. Delayed prescribing strategies have been evaluated previously [5–8], with findings suggesting that they do reduce antibiotic use. This study strengthens the evidence base supporting the delayed strategy.
This study has a few limitations. The sample size was small, and symptom data were obtained via patient self-report. In addition, the randomization procedure was not described. However, the investigators achieved good patient retention, with very few patients lost to follow-up. The investigators used an intention-to-treat analysis; thus, the estimate of treatment effect size can be considered conservative.
In terms of baseline characteristics, the patient-led group had a lower overall education level, fewer smokers, and less respiratory comorbidity, along with nonsignificant differences in cardiovascular comorbidity (P = 0.12) and diabetes (P = 0.19). Otherwise, the groups were very well matched. Most patients in the study had pharyngitis or bronchitis, limiting inferences for patients with rhinosinusitis or exacerbation of mild-to-moderate COPD.
Applications for Clinical Practice
Delayed antibiotic prescribing for acute uncomplicated respiratory infections appears to be an acceptable strategy for reducing the overuse of antibiotics. As patients may lack knowledge of this prescribing strategy [9], clinicians may need to spend time explaining the concept. Using the term “back-up antibiotics” instead of “delayed prescription” [10] may help to increase patients’ understanding and acceptance.
—Ajay Dharod, MD
1. de la Poza Abad M, Mas Dalmau G, Moreno Bakedano M, et al; Delayed Antibiotic Prescription (DAP) Working Group. Rationale, design and organization of the delayed antibiotic prescription (DAP) trial: a randomized controlled trial of the efficacy and safety of delayed antibiotic prescribing strategies in the non-complicated acute respiratory tract infections in general practice. BMC Fam Pract 2013;14:63.
2. Barnett ML, Linder JA. Antibiotic prescribing to adults with sore throat in the United States, 1997-2010. JAMA Intern Med 2014;174:138–40.
3. Barnett ML, Linder JA. Antibiotic prescribing for adults with acute bronchitis in the United States, 1996–2010. JAMA 2014;311:2020–2.
4. McCullough AR, Glasziou PP. Delayed antibiotic prescribing strategies-time to implement? JAMA Intern Med 2016;176:29–30.
5. National Institute for Health and Clinical Excellence. Prescribing of antibiotics for self-limiting respiratory tract infections in adults and children in primary care. Clinical guideline 69. London: NICE; 2008.
6. Arnold SR, Straus SE. Interventions to improve antibiotic prescribing practices in ambulatory care. Cochrane Database Syst Rev 2005;(4):CD003539.
7. Arroll B, Kenealy T, Kerse N. Do delayed prescriptions reduce antibiotic use in respiratory tract infections? A systematic review. Br J Gen Pract 2003;53:871–7.
8. Spurling GKP, Del Mar CB, Dooley L, et al. Delayed antibiotics for respiratory infections. Cochrane Database Syst Rev 2013;4:CD004417.
9. McNulty CAM, Lecky DM, Hawking MKD, et al. Delayed/back up antibiotic prescriptions: what do the public think? BMJ Open 2015;5:e009748.
10. Bunten AK, Hawking MKD, McNulty CAM. Patient information can improve appropriate antibiotic prescribing. Nurs Pract 2015;82:61–3.
NPs, PAs Vital to Hospital Medicine
Yes, it’s time for another “year ahead” type column where the writer attempts to provide clarity on future events. What does “Hospital Medicine 2016” hold for us? I hope by the time Hospital Medicine 2017 rolls around, everyone will have forgotten the wrong predictions and only remember those that reveal my exceptional clairvoyance and prescient knowledge.
NP and PA Practice in Hospital Medicine Will Continue to Grow
Well, it doesn’t take a crystal ball or tarot cards to predict this. One only has to look at the data. The 2012 State of Hospital Medicine report revealed that 51.7% of hospital medicine groups (HMGs) employed nurse practitioners (NPs) and/or physician assistants (PAs) in their practice. Two short years later, the survey showed 83% of HMGs reported having NPs and/or PAs in their groups. That is an astounding amount of growth in a short period of time, which brings me to my next prediction.
HMGs Will Have to Continue to Figure Out How to Hire and Deploy NPs and PAs in Sensible Ways
I know that statement is very controversial. Not. But the true work of utilizing NP and PA providers in hospitalist practice is not in the hiring; it’s how to use these providers in thoughtful, sensible, and cost-effective ways.
A group leader really needs to know and understand the drivers behind the need for these hires as well as understand the financial landscape in the hiring. Are you hiring an NP/PA because you want to reduce your provider workforce cost? Are you hiring to target quality outcomes in a specific patient population? Are you hiring to staff your observation unit, freeing up your physicians for higher-acuity work? Are you hiring to treat and improve physician burnout? Or is this the only carbon-based life form you can attract to the outer boroughs of your northern clime in the deepest, darkest days of January?
All these may or may not be good reasons, but understanding those variables will help you get the right person for the right reason and will help you evaluate the return on investment and the impact on practice.
Diversity Prevents Disease
Much like the potato monoculture of McDonald’s french fries increasing the risk of potato diseases, monoculture in your hospitalist group may breed burnout and bad attitudes. Diversity of experience, perspective, and skill set may inoculate your group, keeping the dreaded crispy coated from complaining about schedule, workload, or acuity or, worse yet, simply leaving.
I don’t have data to support this, but I have heard anecdotally from more than one HMG leader that the addition of NP/PA providers to physician teams has improved physician satisfaction. SHM obviously agrees, as it values and supports a “big tent” philosophy. This big tent includes all types of people who contribute to the culture of the organization, making it stronger, more nimble and innovative, and definitely more fun.
Diversity in providers can only have a positive impact on your organization’s culture.
Whatever the Reason You Hire Them, Get Ready for Change
Be prepared for evolution. You may have initially hired an NP or PA simply to do admissions or to see all of your orthopedic co-management patients. But over time, your practice is going to morph and evolve, hopefully, in positive ways. Bring your NP/PA colleagues along for the ride; pull up a chair to the table. They may be able to provide new direction, support, or service lines to your practice in ways you hadn’t considered.
NP/PA providers’ abilities and ambitions will change over time as well. Make sure that change goes both ways. You may find that their influence and impact on your organization’s productivity and growth extend beyond their clinical role. Consider utilizing NP/PA providers in novel ways; maybe they have great onboarding skills, are fabulous at scheduling, or can look at a spreadsheet without going cross-eyed or bald.
Change is growth. And growth is good. Unless you would rather die.
HM Needs to Develop Innovative Care Models; NPs/PAs Provide a Platform for Innovation
Inpatient medicine is changing in a rapid and unpredictable way. Some of that change is driven by financial incentives and quality indicators, but necessity is the biggest driver of all. People, patients, and providers are getting old (thank God it’s not just me). There simply are not enough physicians to care for our rapidly aging population, or if there are, they are all employed in sunny Southern California. How we respond to this threat or opportunity is one of our most important charges. We own the inpatient kingdom. We need to lead with benevolence and thoughtfulness. We need to really look ahead and identify new ways to manage the complexity of a system that continues to mutate like some avian virus. I can’t see a future without a crucial role played by my NP/PA brethren. Can we begin this conversation with the long view in mind and really begin to own this in a true and responsible way?
Thanks for your attention, and remember, in 2017 you will have forgotten all the ways, if any, that I was wrong. TH
Ms. Cardin is a nurse practitioner in the Section of Hospital Medicine at the University of Chicago and is chair of SHM’s NP/PA Committee. She is a newly elected SHM board member.
Not Sleeping Enough Can Cause Serious Health Issues
ATLANTA (Reuters) - Did you get enough sleep last night? If not, you are not alone. More than one out of three American adults do not get enough sleep, according to a study released Thursday from the U.S. Centers for Disease Control and Prevention.
"That's a big problem," says Dr. Nancy Collop, director of the Emory Sleep Center at Emory University School of Medicine in Atlanta, who is familiar with the study. "You don't function as well, your ability to pay attention is reduced, and it can have serious, long term side effects. It can change your metabolism for the worse."
At least seven hours of sleep is considered healthy for adults aged 18 to 60, according to the American Academy of Sleep Medicine and the Sleep Research Society. The CDC analyzed data from a 2014 survey of 444,306 adults and found roughly 65% of respondents reported getting that amount of sleep.
"Lifestyle changes such as going to bed at the same time each night; rising at the same time each morning; and turning off or removing televisions, computers, mobile devices from the bedroom, can help people get the healthy sleep they need," said Dr. Wayne Giles, director of the CDC's Division of Population Health, in a statement.
Getting less than seven hours a night is associated with an increased risk of obesity, diabetes, high blood pressure, heart disease, stroke and frequent mental distress, the study shows. Published in the CDC's Morbidity and Mortality Weekly Report, the study is the first of its kind to look at all 50 U.S. states and the District of Columbia.
The study found that the adults most likely to get enough sleep were those who were married or employed, with 67% and 65%, respectively, saying they get at least seven hours. Only 56% of divorced adults said they get enough sleep, and just over half of jobless adults sleep seven hours a night regularly. Among the best sleepers were college graduates, with 72% reporting seven hours or more.
The study found geographical differences as well as ethnic disparities. Hawaiian residents get less sleep than those living in South Dakota, the study found. Non-Hispanic white residents sleep better than non-Hispanic black residents, with 67% and 54%, respectively, reporting enough sleep.
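To make the arithmetic behind the headline explicit: if roughly 65% of respondents reported enough sleep, the remaining ~35% did not, which is where the "more than one out of three" figure comes from. Here is a minimal sketch in Python; the counts below are back-calculated from the percentages quoted in the article, not taken from the CDC dataset, and the real study weights its responses, so these raw numbers are illustrative only.

```python
# Back-of-the-envelope check of the CDC figures quoted above.
# Assumption: the 65% share applies uniformly to the full respondent
# count; actual survey estimates are weighted, so counts are approximate.

TOTAL_RESPONDENTS = 444_306   # adults surveyed in 2014 (per the article)
SHARE_ENOUGH_SLEEP = 0.65     # ~65% reported >= 7 hours per night

enough = round(TOTAL_RESPONDENTS * SHARE_ENOUGH_SLEEP)
short = TOTAL_RESPONDENTS - enough

print(f"Reported enough sleep: {enough:,} ({enough / TOTAL_RESPONDENTS:.0%})")
print(f"Fell short:            {short:,} ({short / TOTAL_RESPONDENTS:.0%})")
# ~35% fall short of seven hours, i.e., more than one out of three adults.
```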
Targeting a protein to prevent malignancy
Photo by Aaron Logan
New research suggests hematologic malignancies driven by MYC might be prevented by lowering levels of another protein, MCL-1.
“Our colleagues had previously discovered that reducing the activity of MCL-1 is a promising strategy to treat malignant MYC-driven cancers,” said Stephanie Grabow, PhD, of the Walter and Eliza Hall Institute of Medical Research in Parkville, Victoria, Australia.
“We have now shown that the same approach might be able to prevent those cancers from forming in the first place.”
Dr Grabow and her colleagues described this work in Cell Reports.
Previous research indicated that expression from both MCL-1 alleles is essential for the survival of hematopoietic stem and progenitor cells during stress-induced repopulation of the hematopoietic system.
So, with this study, Dr Grabow and her colleagues set out to determine whether reducing MCL-1 protein levels might hinder the development of hematologic malignancies.
In experiments with mice, the investigators found that loss of one MCL-1 allele significantly delayed the development of MYC-driven lymphoma and reduced MYC-driven accumulation of pre-leukemic cancer-initiating cells.
However, loss of one p53 allele accelerated MYC-driven lymphomagenesis even when one MCL-1 allele was deleted. Loss of PUMA accelerated lymphoma development as well, though to a much lesser extent.
Loss of BIM substantially accelerated lymphomagenesis when one MCL-1 allele was deleted, restoring lymphoma-initiating cells and the rate of tumor development.
And loss of one BIM allele overrode the survival defect observed in pre-leukemic Eμ-Myc B-cell progenitors when one MCL-1 allele was deleted.
The investigators noted that loss of one MCL-1 allele did not noticeably impair the survival of normal B lymphoid cells even though it greatly diminished the survival of MYC-overexpressing B-cell progenitors.
“No one had realized just how vulnerable cells undergoing cancerous changes are to a relatively minor reduction in the levels of MCL-1,” Dr Grabow said.
“We found that MCL-1 is critical for keeping developing cancer cells alive through the stressful events that cause the transformation of a healthy cell into a cancerous cell. This result is particularly exciting because MCL-1 inhibitors are already in development as anticancer drugs.”
Study investigator Brandon Aubrey, MBBS, also of the Walter and Eliza Hall Institute, said this research could inform future strategies to prevent cancer.
“Early treatment or even cancer prevention are likely to be a more effective way to fight cancer than treating an established cancer after it has already formed and made a person sick,” he said. “Our research has suggested that dependency on MCL-1 could be a key vulnerability of many developing cancers.”
“In the future, MCL-1 inhibitors might have potential benefit for treating the very early stages of MYC-driven cancers, or we may even be able to use these agents to prevent people from getting cancer in the first place.”
Study sheds new light on blood clot structure
Photo by Vera Kratochvil
Researchers say they have discovered significant differences between blood clot structure in adults and newborns, a finding that could help us better understand the challenges in addressing post-operative bleeding in neonatal patients.
The researchers also found evidence to suggest the current standard of care for treating post-operative bleeding may pose an increased risk of thrombosis in newborns as compared to adults.
The team reported these findings in Anesthesiology.
“We knew that neonates—infants less than 1 month old—are more likely than adults to suffer from severe bleeding after heart surgery, which poses a variety of health risks,” said study author Ashley Brown, PhD, of the University of North Carolina at Chapel Hill.
“The current standard of care is to give neonatal patients blood products . . . derived from adult blood, but neonatal blood and adult blood aren’t the same. Many of the components involved in clotting in newborns have differing levels of activity, or effectiveness, compared to the same components in adults. Our goal was to better understand how clotting in neonates differs from that in adults so that we can move closer to developing more effective treatment strategies for these infants.”
The researchers’ hypothesis was that fibrinogen from neonates would form clots that are different from those formed by adult fibrinogen, and this proved correct. However, the team was surprised to find that fibrinogen from adults did not integrate well with fibrinogen from neonates.
To test their hypothesis, the researchers took samples of neonate fibrinogen and adult fibrinogen and compared clot formation. The team looked at clots formed solely of adult fibrinogen, clots formed solely of neonate fibrinogen, and clots made from a mixture of the two.
Neonate fibrinogen formed less dense, more fragile clots than adult fibrinogen. Likewise, a mixture of adult and neonate fibrinogen formed clots that were fragile and less dense, even if there was relatively little neonate fibrinogen in the mixture.
The researchers also evaluated how long it took these clots to dissolve. Clots of neonate fibrinogen dissolved about twice as quickly as clots formed from adult fibrinogen.
Clots formed from an adult and neonate fibrinogen mixture dissolved at approximately the same rate as adult-only clots, regardless of the percentage of neonate fibrinogen in the mixture.
“This suggests that using adult fibrinogen in neonatal patients may pose an increased risk of embolism or other adverse thrombotic events,” said study author Nina Guzzetta, MD, of the Emory University School of Medicine in Atlanta, Georgia.
“This work drives home that newborns are not just small adults, and we still have much to learn about clotting in neonates. It also tells us that there is a great deal of room for improvement in the current standard of care for post-operative bleeding in neonates.”
“We are investigating several approaches that may help address this problem, evaluating various modes of action,” Dr Brown added. “It is possible that we can use various external factors that promote clotting to stimulate the fibrinogen in neonates to form a denser clot.”
“We are investigating possible alternatives to help neonates form a better clot after major surgery without having to use adult fibrinogen. For example, we are investigating the use of synthetic platelet-like particles developed by our team to augment hemostasis . . . in blood samples collected from these patients.”