Engaging Veterans With Serious Mental Illness in Primary Care
People with serious mental illness (SMI) are at substantial risk for premature mortality, dying on average 10 to 20 years earlier than others.1 The reasons for this disparity are complex; however, the high prevalence of chronic disease and physical comorbidities in the SMI population has been identified as a prominent factor.2 Engagement and reengagement in care, including primary care for medical comorbidities, can mitigate these mortality risks.2-4 Among veterans with SMI lost to follow-up care for more than 12 months, those not successfully reengaged in care were more likely to die compared with those reengaged in care.2,3
Given this evidence, health care systems, including the US Department of Veterans Affairs (VA), have looked to better engage these patients in care. These efforts have included mental health population health management, colocation of mental health with primary care, designation of primary care teams specializing in SMI, and integration of mental health and primary care services for patients experiencing homelessness.5-8
As part of a national approach to encourage locally driven quality improvement (QI), the VA compiles performance metrics for each facility, across a gamut of care settings, conditions, and veteran populations.9 Quarterly facility report cards, with longitudinal data and cross-facility comparisons, enable facilities to identify targets for QI and track improvement progress. One metric reports on the proportion of enrolled veterans with SMI who have primary care engagement, defined as having an assigned primary care practitioner (PCP) and a primary care visit in the prior 12 months.
In support of a QI initiative at the VA Greater Los Angeles Healthcare System (VAGLAHS), we sought to describe promising practices used by VA facilities with higher levels of primary care engagement among their populations of veterans with SMI.
Methods
We conducted semistructured telephone interviews with a purposeful sample of key informants at VA facilities with high levels of engagement in primary care among veterans with SMI. All project components were conducted by an interdisciplinary team, which included a medical anthropologist (JM), a mental health physician (PR), an internal medicine physician (KC), and other health services researchers (JB, AG). Because the primary objective of the project was QI, this project was designated as nonresearch by the VAGLAHS Institutional Review Board.
The VA Facility Complexity Model classifies facilities into 5 tiers: 1a (most complex), 1b, 1c, 2, and 3 (least complex), based on patient care volume, patient risk, complexity of clinical programs, and size of research and teaching programs. We sampled informants at VA facilities with complexity ratings of 1a or 1b and better than median scores for primary care engagement of veterans with SMI, based on report cards from January 2019 to March 2019. To increase the likelihood of identifying lessons that would generalize to the VAGLAHS, with its large population of veterans experiencing homelessness, we selected facilities serving more than 1000 veterans experiencing homelessness.
At each selected facility, we first aimed to interview mental health leaders responsible for quality measurement and improvement identified from a national VA database. We then used snowball sampling to identify other informants at these VA facilities who were knowledgeable about relevant processes. Potential interviewees were contacted via email.
Interviews
The interview guide was developed by the interdisciplinary team and based on published literature about strategies for engaging patients with SMI in care. Interview guide questions focused on local practice arrangements, panel management, population health practices, and quality measurement and improvement efforts for engaging veterans with SMI in primary care (Appendix). Interviews were conducted by telephone, from May 2019 through July 2019, by experienced qualitative interviewers (JM, JB). Interviewees were assured confidentiality of their responses.
Interview audio recordings were used to generate detailed notes (AG). Structured summaries were prepared from these notes, using a template based on the interview guide. We organized these summaries into matrices for analysis, grouping summarized points by interview domains to facilitate comparison across interviews.10,11 Our team reviewed and discussed the matrices, and iteratively identified and defined themes to identify the common engagement approaches and the nature of the connections between mental health and primary care. To ensure rigor, findings were checked by the senior qualitative lead (JM).
Results
The median SMI engagement score—defined as the proportion of veterans with SMI who have had a primary care visit in the prior 12 months and who have an assigned PCP—was 75.6% across 1a and 1b VA facilities. We identified 16 VA facilities that had a median or higher score and more than 1000 enrolled veterans experiencing homelessness. From these 16 facilities, we emailed 31 potential interviewees, 14 of whom were identified from a VA database and 17 referred by other interviewees. In total, we interviewed 18 key informants across 11 (69%) facilities, including chiefs of psychology and mental health services, PCPs with mental health expertise, QI specialists, a psychosocial rehabilitation leader, and a local recovery coordinator, who helps veterans with SMI access recovery-oriented services. Characteristics of the facilities and interviewees are shown in Table 1. Interviews lasted a mean of 35 minutes (range, 26-50).
Engagement Approaches
The strategies used to engage veterans with SMI were heterogeneous, with no single strategy common across all facilities. However, we identified 2 categories of engagement approaches: targeted outreach and routine practices.
Targeted outreach strategies included deliberate, systematic approaches to reach veterans with SMI outside of regularly scheduled visits. These strategies were designed to be proactive, often prioritizing veterans at risk of disengaging from care. Designated VA care team members identified and reached out to veterans well before 12 months had passed since their prior visit (the VA definition of disengagement from care); visits included any care at VA, including, but not exclusively, primary care. Table 2 describes the key components of targeted outreach strategies: (1) identifying veterans’ last visit; (2) prioritizing which veterans to contact; and (3) assigning responsibility and reaching out. A key defining feature of targeted outreach is that veterans were identified and prioritized for outreach independent of any visits with mental health or other VA services.
To identify veterans at risk for disengagement, a designated employee in mental health or primary care (eg, a local recovery coordinator) reviewed a VA dashboard or locally developed report that identified veterans who had not engaged in care for several months. This process was repeated regularly. The designated employee either contacted those veterans directly or coordinated with other clinicians and support staff. When possible, a clinician or nurse with an existing relationship with the veteran would call them. If no such relationship existed, an administrative staff member made a cold call, sometimes accompanied by mailed outreach materials.
Routine practices were business-as-usual activities embedded in regular clinical workflows that facilitated engagement or reengagement of veterans with SMI in care. Of note, and in contrast to targeted outreach, these activities were tied to veteran visits with mental health practitioners. These practices were typically described as being at least as important as targeted outreach efforts. For example, during mental health visits, clinicians routinely checked the VA electronic health record to assess whether veterans had an assigned primary care team. If not, they would contact the primary care service to refer the patient for a primary care visit and assignment. If the patient already had a primary care team assigned, the mental health practitioner checked for recent primary care visits. If none were evident, the mental health practitioner might email the assigned PCP or contact them via instant message.
At some facilities, mental health support staff were able to directly schedule primary care appointments, which was identified as an important enabling factor in promoting mental health patient engagement in primary care. Some interviewees seemed to take for granted the idea that mental health practitioners would help engage patients in primary care—suggesting that these practices had perhaps become a cultural norm within their facility. However, some interviewees identified clear strategies for making these practices a consistent part of care—for example, by designing a protocol for initial mental health assessments to include a routine check for primary care engagement.
Mental Health/Primary Care Connections
Interviewees characterized the nature of the connections between mental health and primary care at their facilities. Nearly all interviewees described that their medical centers had extensive ties, formal and informal, between mental health and primary care.
Formal ties included the reverse integration care model, in which primary care services are embedded in mental health settings. Interviewees at sites with programs based on this model noted that these programs enabled warm hand-offs from mental health to primary care and suggested that the model can foster integration between primary care and mental health care for patients with SMI. However, the size, scope, and structure of these programs varied, sometimes serving a small proportion of a facility’s population of patients with SMI. Other examples of formal ties included written agreements, establishing frequent, regular meetings between mental health and primary care leadership and front-line staff, and giving mental health clerks the ability to directly schedule primary care appointments.
Informal ties between mental health and primary care included communication and personal working relationships between mental health and PCPs, facilitated by mental health and primary care leaders working together in workgroups and other administrative activities. Some participants described a history of collaboration between mental health and primary care leaders yielding productive and trusting working relationships. Some interviewees described frequent direct communication between individual mental health practitioners and PCPs—either face-to-face or via secure messaging.
Discussion
VA facilities with high levels of primary care engagement among veterans with SMI used extensive engagement strategies, including a diverse array of targeted outreach and routine practices. Both approaches were established and supported by intentional decisions about organizational structure and process, as well as by formal and informal ties between mental health and primary care. In addition, organizational cultural factors were especially relevant to routine practice strategies.
To enable targeted outreach, a range of organizational resources, both local and national, was required. Large accountable care organizations and integrated delivery systems, like the VA, are often better able to create dashboards and other informational resources for population health management compared with smaller, less integrated health care systems. Though these resources are difficult to create in fragmented systems, comparable tools have been explored by multiple state health departments.12 Our findings suggest that these data tools, though resource intensive to develop, may enable facilities to be more methodical and reliable in conducting outreach to vulnerable patients.
In contrast to targeted outreach, routine practices depend less on population health management resources and more on cultural norms. Such norms are notoriously difficult to change, but intentional structural decisions like embedding primary care engagement in mental health protocols may signal that primary care engagement is an important and legitimate consideration for mental health care.13
We identified extensive and heterogeneous connections between mental health and primary care in our sample of VA facilities with high engagement of patients with SMI in primary care. A growing body of literature on relational coordination studies the factors that contribute to organizational siloing and mechanisms for breaking down those silos so work can be coordinated across boundaries (eg, the organizational boundary between mental health and primary care).14 Coordinating care across these boundaries, through good relational coordination practices, has been shown to improve outcomes in health care and other sectors. Notably, VA facilities in our sample had several of the defining characteristics of good relational coordination: relationships between mental health and primary care that include shared goals, shared knowledge, and mutual respect, all reinforced by frequent communication structured around problem solving.15 The relational coordination literature also offers a way to identify evidence-based interventions for facilitating relational coordination in places where it is lacking, for example, with information systems, boundary-spanning individuals, facility design, and formal conflict resolution.15 Future work might explore how relational coordination can be further used to optimize mental health and primary care connections to keep veterans with SMI engaged in care.
Our approach of interviewing informants in higher-performing facilities draws heavily on the idea of positive deviance, which holds that information on what works in health care is available from organizations that already are demonstrating “consistently exceptional performance.”16 This approach works best when high performance and organizational characteristics are observable for a large number of facilities, and when high-performing facilities are willing to share their strategies. These features allow investigators to identify promising practices and hypotheses that can then be empirically tested and compared. Such testing, including assessing for unintended consequences, is needed for the approaches we identified. Research is also needed to assess for factors that would promote the implementation of effective strategies.
Limitations
As a QI project seeking to identify promising practices, our interviews were limited to 18 key informants across 11 VA facilities with high engagement in care among veterans with SMI. No inferences can be made that these practices are directly related to this high level of engagement, nor can inferences be made about the differential impact of individual practices. Future work is needed to assess these relationships. We also did not interview veterans to understand their perspectives on these strategies, which is an additional important topic for future work. In addition, these interviews were performed before the start of the COVID-19 pandemic. Further work is needed to understand how these strategies may have been modified in response to changes in practice. The shift from in-person to virtual care may have affected both clinical interactions with veterans and communication between clinicians.
Conclusions
Interviews with key informants demonstrate that while engaging and retaining veterans with SMI in primary care is vital, it also requires intentional and potentially resource-intensive practices, including targeted outreach and routine engagement strategies embedded into mental health visits. These promising practices can provide valuable insights for both VA and community health care systems providing care to patients with SMI.
Acknowledgments
We thank Gracielle J. Tan, MD for administrative assistance in preparing this manuscript.
1. Liu NH, Daumit GL, Dua T, et al. Excess mortality in persons with severe mental disorders: a multilevel intervention framework and priorities for clinical practice, policy and research agendas. World Psychiatry. 2017;16(1):30-40. doi:10.1002/wps.20384
2. Bowersox NW, Kilbourne AM, Abraham KM, et al. Cause-specific mortality among veterans with serious mental illness lost to follow-up. Gen Hosp Psychiatry. 2012;34(6):651-653. doi:10.1016/j.genhosppsych.2012.05.014
3. Davis CL, Kilbourne AM, Blow FC, et al. Reduced mortality among Department of Veterans Affairs patients with schizophrenia or bipolar disorder lost to follow-up and engaged in active outreach to return for care. Am J Public Health. 2012;102(suppl 1):S74-S79. doi:10.2105/AJPH.2011.300502
4. Copeland LA, Zeber JE, Wang CP, et al. Patterns of primary care and mortality among patients with schizophrenia or diabetes: a cluster analysis approach to the retrospective study of healthcare utilization. BMC Health Serv Res. 2009;9:127. doi:10.1186/1472-6963-9-127
5. Abraham KM, Mach J, Visnic S, McCarthy JF. Enhancing treatment reengagement for veterans with serious mental illness: evaluating the effectiveness of SMI re-engage. Psychiatr Serv. 2018;69(8):887-895. doi:10.1176/appi.ps.201700407
6. Ward MC, Druss BG. Reverse integration initiatives for individuals with serious mental illness. Focus (Am Psychiatr Publ). 2017;15(3):271-278. doi:10.1176/appi.focus.20170011
7. Chang ET, Vinzon M, Cohen AN, Young AS. Effective models urgently needed to improve physical care for people with serious mental illnesses. Health Serv Insights. 2019;12:1178632919837628. Published 2019 Apr 2. doi:10.1177/1178632919837628
8. Gabrielian S, Gordon AJ, Gelberg L, et al. Primary care medical services for homeless veterans. Fed Pract. 2014;31(10):10-19.
9. Lemke S, Boden MT, Kearney LK, et al. Measurement-based management of mental health quality and access in VHA: SAIL mental health domain. Psychol Serv. 2017;14(1):1-12. doi:10.1037/ser0000097
10. Averill JB. Matrix analysis as a complementary analytic strategy in qualitative inquiry. Qual Health Res. 2002;12(6):855-866. doi:10.1177/104973230201200611
11. Zuchowski JL, Chrystal JG, Hamilton AB, et al. Coordinating care across health care systems for Veterans with gynecologic malignancies: a qualitative analysis. Med Care. 2017;55(suppl 1):S53-S60. doi:10.1097/MLR.0000000000000737
12. Daumit GL, Stone EM, Kennedy-Hendricks A, Choksy S, Marsteller JA, McGinty EE. Care coordination and population health management strategies and challenges in a behavioral health home model. Med Care. 2019;57(1):79-84. doi:10.1097/MLR.0000000000001023
13. Parmelli E, Flodgren G, Beyer F, et al. The effectiveness of strategies to change organisational culture to improve healthcare performance: a systematic review. Implement Sci. 2011;6(33):1-8. doi:10.1186/1748-5908-6-33
14. Bolton R, Logan C, Gittell JH. Revisiting relational coordination: a systematic review. J Appl Behav Sci. 2021;57(3):290-322. doi:10.1177/0021886321991597
15. Gittell JH, Godfrey M, Thistlethwaite J. Interprofessional collaborative practice and relational coordination: improving healthcare through relationships. J Interprof Care. 2013;27(3):210-213. doi:10.3109/13561820.2012.730564
16. Bradley EH, Curry LA, Ramanadhan S, Rowe L, Nembhard IM, Krumholz HM. Research in action: using positive deviance to improve quality of health care. Implement Sci. 2009;4:25. Published 2009 May 8. doi:10.1186/1748-5908-4-25
Acknowledgments
We thank Gracielle J. Tan, MD for administrative assistance in preparing this manuscript.
The VA Facility Complexity Model classifies facilities into 5 tiers: 1a (most complex), 1b, 1c, 2, and 3 (least complex), based on patient care volume, patient risk, complexity of clinical programs, and size of research and teaching programs. We sampled informants at VA facilities with complexity ratings of 1a or 1b with better than median scores for primary care engagement of veterans with SMI based on report cards from January 2019 to March 2019. To increase the likelihood of identifying lessons that can generalize to the VAGLAHS with its large population of veterans experiencing homelessness, we selected facilities serving populations consisting of more than 1000 veterans experiencing homelessness.
At each selected facility, we first aimed to interview mental health leaders responsible for quality measurement and improvement identified from a national VA database. We then used snowball sampling to identify other informants at these VA facilities who were knowledgeable about relevant processes. Potential interviewees were contacted via email.
Interviews
The interview guide was developed by the interdisciplinary team and based on published literature about strategies for engaging patients with SMI in care. Interview guide questions focused on local practice arrangements, panel management, population health practices, and quality measurement and improvement efforts for engaging veterans with SMI in primary care (Appendix). Interviews were conducted by telephone, from May 2019 through July 2019, by experienced qualitative interviewers (JM, JB). Interviewees were assured confidentiality of their responses.
Interview audio recordings were used to generate detailed notes (AG). Structured summaries were prepared from these notes, using a template based on the interview guide. We organized these summaries into matrices for analysis, grouping summarized points by interview domains to facilitate comparison across interviews.10,11 Our team reviewed and discussed the matrices, and iteratively identified and defined themes capturing the common engagement approaches and the nature of the connections between mental health and primary care. To ensure rigor, findings were checked by the senior qualitative lead (JM).
Results
The median SMI engagement score—defined as the proportion of veterans with SMI who have had a primary care visit in the prior 12 months and who have an assigned PCP—was 75.6% across 1a and 1b VA facilities. We identified 16 VA facilities that had a median or higher score and more than 1000 enrolled veterans experiencing homelessness. From these 16 facilities, we emailed 31 potential interviewees, 14 of whom were identified from a VA database and 17 of whom were referred by other interviewees. In total, we interviewed 18 key informants across 11 (69%) facilities, including chiefs of psychology and mental health services, PCPs with mental health expertise, QI specialists, a psychosocial rehabilitation leader, and a local recovery coordinator, who helps veterans with SMI access recovery-oriented services. Characteristics of the facilities and interviewees are shown in Table 1. Interviews lasted a mean of 35 minutes (range, 26-50).
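As a hypothetical illustration only (the field names, data structures, and 365-day window are our assumptions, not the VA's actual report-card implementation), the engagement metric described above—an assigned PCP plus a primary care visit in the prior 12 months—could be computed along these lines:

```python
from datetime import date, timedelta

def engagement_score(veterans, as_of):
    """Proportion of veterans with SMI who have an assigned primary care
    practitioner (PCP) and a primary care visit in the prior 12 months.
    Each roster entry is assumed to carry an assigned PCP (or None) and
    the date of the most recent primary care visit (or None)."""
    window_start = as_of - timedelta(days=365)
    engaged = [
        v for v in veterans
        if v["assigned_pcp"] is not None
        and v["last_pc_visit"] is not None
        and v["last_pc_visit"] >= window_start
    ]
    return len(engaged) / len(veterans) if veterans else 0.0

# Illustrative roster: two of four veterans meet both criteria.
roster = [
    {"assigned_pcp": "Dr. A", "last_pc_visit": date(2019, 1, 15)},
    {"assigned_pcp": None,    "last_pc_visit": date(2019, 2, 1)},
    {"assigned_pcp": "Dr. B", "last_pc_visit": date(2017, 6, 30)},
    {"assigned_pcp": "Dr. C", "last_pc_visit": date(2018, 11, 2)},
]
print(engagement_score(roster, as_of=date(2019, 3, 1)))  # 0.5
```

The facility-level report-card score would then be this proportion computed over the facility's full roster of enrolled veterans with SMI.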
Engagement Approaches
The strategies used to engage veterans with SMI were heterogeneous, with no single strategy common across all facilities. However, we identified 2 categories of engagement approaches: targeted outreach and routine practices.
Targeted outreach strategies included deliberate, systematic approaches to reach veterans with SMI outside of regularly scheduled visits. These strategies were designed to be proactive, often prioritizing veterans at risk of disengaging from care. Designated VA care team members identified and reached out to veterans well before 12 months had passed since their prior visit (the VA definition of disengagement from care); visits included any care at VA, including, but not exclusively, primary care. Table 2 describes the key components of targeted outreach strategies: (1) identifying veterans’ last visit; (2) prioritizing which veterans to contact; and (3) assigning responsibility and reaching out. A key defining feature of targeted outreach is that veterans were identified and prioritized for outreach independent of any visits with mental health or other VA services.
In identifying veterans at risk for disengagement, a designated employee in mental health or primary care (eg, local recovery coordinator) reviewed a VA dashboard or locally developed report that identified veterans who have not engaged in care for several months. This process was repeated regularly. The designated employee either contacted those veterans directly or coordinated with other clinicians and support staff. When possible, a clinician or nurse with an existing relationship with the veteran would call them. If no such relationship existed, an administrative staff member made a cold call, sometimes accompanied by mailed outreach materials.
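A minimal sketch of the outreach logic described above, assuming illustrative field names and a 3-month lead time before the 12-month disengagement threshold (the actual VA dashboards and locally developed reports are internal tools whose implementation we did not observe):

```python
from datetime import date, timedelta

def flag_for_outreach(veterans, as_of, lead_months=3):
    """Flag veterans whose last VA visit (any service, not only primary
    care) is approaching the 12-month disengagement threshold, sorted so
    those who have gone longest without a visit are contacted first.
    The lead_months parameter is an assumed planning buffer."""
    threshold = as_of - timedelta(days=365 - lead_months * 30)
    flagged = [v for v in veterans if v["last_va_visit"] <= threshold]
    # Oldest last-visit dates first: closest to formal disengagement.
    return sorted(flagged, key=lambda v: v["last_va_visit"])
```

A designated employee rerunning such a report on a regular cadence, then routing each flagged veteran to a clinician with an existing relationship (or to administrative staff for a cold call), corresponds to the repeated review-and-coordinate cycle the interviewees described.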
Routine practices were business-as-usual activities embedded in regular clinical workflows that facilitated engagement or reengagement of veterans with SMI in care. Of note, and in contrast to targeted outreach, these activities were tied to veteran visits with mental health practitioners. These practices were typically described as being at least as important as targeted outreach efforts. For example, during mental health visits, clinicians routinely checked the VA electronic health record to assess whether veterans had an assigned primary care team. If not, they would contact the primary care service to refer the patient for a primary care visit and assignment. If the patient already had a primary care team assigned, the mental health practitioner checked for recent primary care visits. If none were evident, the mental health practitioner might email the assigned PCP or contact them via instant message.
At some facilities, mental health support staff were able to directly schedule primary care appointments, which was identified as an important enabling factor in promoting mental health patient engagement in primary care. Some interviewees seemed to take for granted the idea that mental health practitioners would help engage patients in primary care—suggesting that these practices had perhaps become a cultural norm within their facility. However, some interviewees identified clear strategies for making these practices a consistent part of care—for example, by designing a protocol for initial mental health assessments to include a routine check for primary care engagement.
Mental Health/Primary Care Connections
Interviewees characterized the nature of the connections between mental health and primary care at their facilities. Nearly all interviewees described that their medical centers had extensive ties, formal and informal, between mental health and primary care.
Formal ties included the reverse integration care model, in which primary care services are embedded in mental health settings. Interviewees at sites with programs based on this model noted that these programs enabled warm hand-offs from mental health to primary care and suggested that the model can foster integration between primary care and mental health care for patients with SMI. However, the size, scope, and structure of these programs varied, sometimes serving only a small proportion of a facility’s patients with SMI. Other examples of formal ties included written agreements; frequent, regular meetings between mental health and primary care leadership and front-line staff; and giving mental health clerks the ability to directly schedule primary care appointments.
Informal ties between mental health and primary care included communication and personal working relationships between mental health and PCPs, facilitated by mental health and primary care leaders working together in workgroups and other administrative activities. Some participants described a history of collaboration between mental health and primary care leaders yielding productive and trusting working relationships. Some interviewees described frequent direct communication between individual mental health practitioners and PCPs—either face-to-face or via secure messaging.
Discussion
VA facilities with high levels of primary care engagement among veterans with SMI used extensive engagement strategies, including a diverse array of targeted outreach and routine practices. Both approaches were established and supported by intentional organizational structure and process decisions, as well as by formal and informal ties between mental health and primary care. In addition, organizational cultural factors were especially relevant to routine practice strategies.
Targeted outreach required a range of organizational resources, both local and national. Large accountable care organizations and integrated delivery systems, like the VA, are often better able to create dashboards and other informational resources for population health management than smaller, less integrated health care systems. Though these resources are difficult to create in fragmented systems, comparable tools have been explored by multiple state health departments.12 Our findings suggest that these data tools, though resource intensive to develop, may enable facilities to be more methodical and reliable in conducting outreach to vulnerable patients.
In contrast to targeted outreach, routine practices depend less on population health management resources and more on cultural norms. Such norms are notoriously difficult to change, but intentional structural decisions like embedding primary care engagement in mental health protocols may signal that primary care engagement is an important and legitimate consideration for mental health care.13
We identified extensive and heterogeneous connections between mental health and primary care in our sample of VA facilities with high engagement of patients with SMI in primary care. A growing body of literature on relational coordination studies the factors that contribute to organizational siloing and mechanisms for breaking down those silos so work can be coordinated across boundaries (eg, the organizational boundary between mental health and primary care).14 Coordinating care across these boundaries through good relational coordination practices has been shown to improve outcomes in health care and other sectors. Notably, VA facilities in our sample had several of the defining characteristics of good relational coordination: relationships between mental health and primary care that include shared goals, shared knowledge, and mutual respect, all reinforced by frequent communication structured around problem solving.15 The relational coordination literature also offers a way to identify evidence-based interventions for facilitating relational coordination in places where it is lacking, for example, with information systems, boundary-spanning individuals, facility design, and formal conflict resolution.15 Future work might explore how relational coordination can be further used to optimize mental health and primary care connections to keep veterans with SMI engaged in care.
Our approach of interviewing informants in higher-performing facilities draws heavily on the idea of positive deviance, which holds that information on what works in health care is available from organizations that already are demonstrating “consistently exceptional performance.”16 This approach works best when high performance and organizational characteristics are observable for a large number of facilities, and when high-performing facilities are willing to share their strategies. These features allow investigators to identify promising practices and hypotheses that can then be empirically tested and compared. Such testing, including assessing for unintended consequences, is needed for the approaches we identified. Research is also needed to assess for factors that would promote the implementation of effective strategies.
Limitations
As a QI project seeking to identify promising practices, our interviews were limited to 18 key informants across 11 VA facilities with high levels of primary care engagement among veterans with SMI. No inferences can be made that these practices are directly related to this high level of engagement, nor about the differential impact of individual practices. Future work is needed to assess these relationships. We also did not interview veterans to understand their perspectives on these strategies, an additional important topic for future work. In addition, these interviews were performed before the start of the COVID-19 pandemic; further work is needed to understand how these strategies may have been modified in response to changes in practice. The shift from in-person to virtual care may have affected clinical interactions with veterans as well as interactions between clinicians.
Conclusions
Interviews with key informants demonstrate that engaging and retaining veterans with SMI in primary care is vital but requires intentional and potentially resource-intensive practices, including targeted outreach and routine engagement strategies embedded into mental health visits. These promising practices can provide valuable insights for both VA and community health care systems providing care to patients with SMI.
Acknowledgments
We thank Gracielle J. Tan, MD for administrative assistance in preparing this manuscript.
1. Liu NH, Daumit GL, Dua T, et al. Excess mortality in persons with severe mental disorders: a multilevel intervention framework and priorities for clinical practice, policy and research agendas. World Psychiatry. 2017;16(1):30-40. doi:10.1002/wps.20384
2. Bowersox NW, Kilbourne AM, Abraham KM, et al. Cause-specific mortality among veterans with serious mental illness lost to follow-up. Gen Hosp Psychiatry. 2012;34(6):651-653. doi:10.1016/j.genhosppsych.2012.05.014
3. Davis CL, Kilbourne AM, Blow FC, et al. Reduced mortality among Department of Veterans Affairs patients with schizophrenia or bipolar disorder lost to follow-up and engaged in active outreach to return for care. Am J Public Health. 2012;102(suppl 1):S74-S79. doi:10.2105/AJPH.2011.300502
4. Copeland LA, Zeber JE, Wang CP, et al. Patterns of primary care and mortality among patients with schizophrenia or diabetes: a cluster analysis approach to the retrospective study of healthcare utilization. BMC Health Serv Res. 2009;9:127. doi:10.1186/1472-6963-9-127
5. Abraham KM, Mach J, Visnic S, McCarthy JF. Enhancing treatment reengagement for veterans with serious mental illness: evaluating the effectiveness of SMI re-engage. Psychiatr Serv. 2018;69(8):887-895. doi:10.1176/appi.ps.201700407
6. Ward MC, Druss BG. Reverse integration initiatives for individuals with serious mental illness. Focus (Am Psychiatr Publ). 2017;15(3):271-278. doi:10.1176/appi.focus.20170011
7. Chang ET, Vinzon M, Cohen AN, Young AS. Effective models urgently needed to improve physical care for people with serious mental illnesses. Health Serv Insights. 2019;12:1178632919837628. Published 2019 Apr 2. doi:10.1177/1178632919837628
8. Gabrielian S, Gordon AJ, Gelberg L, et al. Primary care medical services for homeless veterans. Fed Pract. 2014;31(10):10-19.
9. Lemke S, Boden MT, Kearney LK, et al. Measurement-based management of mental health quality and access in VHA: SAIL mental health domain. Psychol Serv. 2017;14(1):1-12. doi:10.1037/ser0000097
10. Averill JB. Matrix analysis as a complementary analytic strategy in qualitative inquiry. Qual Health Res. 2002;12(6):855-866. doi:10.1177/104973230201200611
11. Zuchowski JL, Chrystal JG, Hamilton AB, et al. Coordinating care across health care systems for Veterans with gynecologic malignancies: a qualitative analysis. Med Care. 2017;55(suppl 1):S53-S60. doi:10.1097/MLR.0000000000000737
12. Daumit GL, Stone EM, Kennedy-Hendricks A, Choksy S, Marsteller JA, McGinty EE. Care coordination and population health management strategies and challenges in a behavioral health home model. Med Care. 2019;57(1):79-84. doi:10.1097/MLR.0000000000001023
13. Parmelli E, Flodgren G, Beyer F, et al. The effectiveness of strategies to change organisational culture to improve healthcare performance: a systematic review. Implement Sci. 2011;6(33):1-8. doi:10.1186/1748-5908-6-33
14. Bolton R, Logan C, Gittell JH. Revisiting relational coordination: a systematic review. J Appl Behav Sci. 2021;57(3):290-322. doi:10.1177/0021886321991597
15. Gittell JH, Godfrey M, Thistlethwaite J. Interprofessional collaborative practice and relational coordination: improving healthcare through relationships. J Interprof Care. 2013;27(3):210-13. doi:10.3109/13561820.2012.730564
16. Bradley EH, Curry LA, Ramanadhan S, Rowe L, Nembhard IM, Krumholz HM. Research in action: using positive deviance to improve quality of health care. Implement Sci. 2009;4:25. Published 2009 May 8. doi:10.1186/1748-5908-4-25
Catheter-Directed Retrieval of an Infected Fragment in a Vietnam War Veteran
Shrapnel injuries are commonly encountered in war zones.1 Retained fragments can remain asymptomatic or cause health effects ranging from local reactions to systemic toxicity, depending on the patient’s reaction to the chemical composition and corrosiveness of the fragments in vivo.2 We present a case of a reactivating shrapnel injury in the form of a retroperitoneal infection and subsequent iliopsoas abscess. A collaborative procedure was performed by surgery and interventional radiology to snare and remove the infected fragment and drain the abscess.
Case Presentation
While serving in Vietnam, a soldier sustained a fragment injury to his left lower abdomen. He underwent a laparotomy, small bowel resection, and a temporary ileostomy at the time of the injury. Nearly 50 years later, the patient presented with chronic left lower quadrant pain and a low-grade fever. He was diagnosed clinically in the emergency department (ED) with diverticulitis and treated with antibiotics. The patient initially responded to treatment but returned 6 months later with similar symptoms, low-grade fever, and mild leukocytosis. A computed tomography (CT) scan without IV contrast during that encounter revealed a few scattered colonic diverticula without definite diverticulitis as well as a metallic fragment embedded in the left iliopsoas with increased soft tissue density.
The patient was diagnosed with a pelvic/abdominal wall hematoma and was discharged with pain medication. The patient reported recurrent attacks of left lower quadrant pain, fever, and changes in bowel habits, prompting gastrointestinal consultation and a colonoscopy that was unremarkable. Ten months later, the patient again presented to the ED, with recurrent symptoms, a fever of 102 °F, and leukocytosis with a white blood cell count of 11.7 × 109/L. CT scan with IV contrast revealed a large left iliopsoas abscess associated with an approximately 1-cm metallic fragment (Figure 1). A drainage catheter was placed under CT guidance and approximately 270 mL of purulent fluid was drained. Culture of the fluid was positive for Escherichia coli (E coli). Two days after drain placement, the fragment was removed as a joint procedure with interventional radiology and surgery. Using the drainage catheter tract as a point of entry, multiple attempts were made to retrieve the fragment with Olympus EndoJaw endoscopic forceps without success.
Ultimately, a stiff directional sheath from a Cook Medical transjugular liver biopsy kit was used with a Merit Medical EnSnare to relocate the fragment to the left inguinal region for surgical excision (Figures 2, 3, and 4). The fragment was removed and swabbed for culture and sensitivity, and a BLAKE drain was placed in the evacuated abscess cavity. The patient tolerated the procedure well and was discharged the following day. Three days later, culture and sensitivity grew E coli and Acinetobacter, confirming infection and a nidus for the surrounding abscess formation. On follow-up with general surgery 7 days later, the patient reported he was doing well, and the drain was removed without difficulty.
Discussion
Foreign body injuries can be benign or debilitating depending on the initial damage, the anatomical location and composition of the foreign body, and the patient’s response to it. Retained shrapnel deep within the muscle tissue rarely causes complications. Although embedded objects are often asymptomatic and require no further management, migration of the foreign body or the formation of a fistula is possible, causing symptoms and requiring surgical intervention.1 One case involved the formation of a purulent fistula appearing a year after an explosive wound to the lumbosacral spine, which was treated with antimicrobials. Recurrence of the fistula several times after treatment led to surgical removal of the shrapnel along with antibiotic treatment of the osteomyelitis.3 Although uncommon, lead exposure from retained foreign body fragments after gunshot or military-related injuries can cause systemic lead toxicity. Symptoms may range from abdominal pain, nausea, and constipation to jaundice and hepatitis.4 The severity has been reported to correlate with the surface area of the lead exposed for dissolution.5 Migration of foreign bodies and shrapnel to other sites in the body, such as movement from soft tissues into distantly located body cavities, has been reported as well. One such case involved the spontaneous onset of knee synovitis due to an intra-articular metallic object introduced via a blast injury to the upper third of the ipsilateral thigh.1
In this patient’s case, a large intramuscular abscess had formed nearly 50 years after the initial combat injury, requiring drainage of the abscess and removal of the fragment. By snaring the foreign body to a more superficial site, the surgical removal required only a minor incision, decreasing recovery time and the likelihood of postoperative complications that would have been associated with a large retroperitoneal dissection. While loop snare is often the first-line technique for the removal of intravascular foreign bodies, its use for materials retained in soft tissue is scarcely reported.6 More typical uses involve the removal of intraluminal materials, such as partially fractured venous catheters, guide wires, stents, and vena cava filters; one report noted that in all 16 cases of percutaneous foreign body retrieval, no surgical intervention was required.7 In the case of most nonvascular foreign bodies, however, surgical retrieval is usually performed.8
Surgical removal of foreign bodies can be difficult when the foreign body is located next to vital structures.9 An additional challenge to a sole surgical approach arises when the foreign body is small and lies deep within the soft tissue, as was the case for our patient. In such cases, the surgical procedure can be time consuming and cause more trauma to the surrounding tissues.10 These factors necessitate consideration of postoperative morbidity and mortality.
In our patient, the retained fragment was embedded in the wall of an abscess located retroperitoneally in his iliopsoas muscle. When considering the proximity of the iliopsoas muscle to the digestive tract, urinary tract, and iliac lymph nodes, it is reasonable for infectious material to come in contact with the foreign body from these nearby structures, resulting in secondary infection.11 Surgery was previously considered the first-line treatment for retroperitoneal abscesses until the advent of imaging-guided percutaneous drainage.12
In some instances, surgical drainage may still be attempted, such as if there are different disease processes requiring open surgery or if percutaneous catheter drainage is not technically possible due to the location of the abscess, thick exudate, loculation/septations, or phlegmon. In these cases, laparoscopic drainage as opposed to open surgical drainage can provide the benefits of an open procedure (ie, total drainage and resection of infected tissue) but is less invasive, requires a smaller incision, and heals faster.13 Percutaneous drainage is the current first-line treatment due to the lack of need for general anesthesia, lower cost, and better morbidity and mortality outcomes compared to surgical methods.12 While percutaneous drainage proved to be immediately therapeutic for our patient, the risk of abscess recurrence with the retained infected fragment necessitated coordination of procedures across specialties to provide the best outcome for the patient.
Conclusions
This case demonstrates a multidisciplinary approach to transforming an otherwise large retroperitoneal dissection to a minimally invasive and technically efficient abscess drainage and foreign body retrieval.
1. Schroeder JE, Lowe J, Chaimsky G, Liebergall M, Mosheiff R. Migrating shrapnel: a rare cause of knee synovitis. Mil Med. 2010;175(11):929-930. doi:10.7205/milmed-d-09-00254
2. Centeno JA, Rogers DA, van der Voet GB, et al. Embedded fragments from U.S. military personnel—chemical analysis and potential health implications. Int J Environ Res Public Health. 2014;11(2):1261-1278. Published 2014 Jan 23. doi:10.3390/ijerph110201261
3. Carija R, Busic Z, Bradaric N, Bulovic B, Borzic Z, Pavicic-Perkovic S. Surgical removal of metallic foreign body (shrapnel) from the lumbosacral spine and the treatment of chronic osteomyelitis: a case report. West Indian Med J. 2014;63(4):373-375. doi:10.7727/wimj.2012.290
4. Grasso I, Blattner M, Short T, Downs J. Severe systemic lead toxicity resulting from extra-articular retained shrapnel presenting as jaundice and hepatitis: a case report and review of the literature. Mil Med. 2017;182(3-4):e1843-e1848. doi:10.7205/MILMED-D-16-00231
5. Dillman RO, Crumb CK, Lidsky MJ. Lead poisoning from a gunshot wound: report of a case and review of the literature. Am J Med. 1979;66(3):509-514. doi:10.1016/0002-9343(79)91083-0
6. Woodhouse JB, Uberoi R. Techniques for intravascular foreign body retrieval. Cardiovasc Intervent Radiol. 2013;36(4):888-897. doi:10.1007/s00270-012-0488-8
7. Mallmann CV, Wolf KJ, Wacker FK. Retrieval of vascular foreign bodies using a self-made wire snare. Acta Radiol. 2008;49(10):1124-1128. doi:10.1080/02841850802454741
8. Nosher JL, Siegel R. Percutaneous retrieval of nonvascular foreign bodies. Radiology. 1993;187(3):649-651. doi:10.1148/radiology.187.3.8497610
9. Fu Y, Cui LG, Romagnoli C, Li ZQ, Lei YT. Ultrasound-guided removal of retained soft tissue foreign body with late presentation. Chin Med J (Engl). 2017;130(14):1753-1754. doi:10.4103/0366-6999.209910
10. Liang HD, Li H, Feng H, Zhao ZN, Song WJ, Yuan B. Application of intraoperative navigation and positioning system in the removal of deep foreign bodies in the limbs. Chin Med J (Engl). 2019;132(11):1375-1377. doi:10.1097/CM9.0000000000000253
11. Moriarty CM, Baker RJ. A pain in the psoas. Sports Health. 2016;8(6):568-572. doi:10.1177/1941738116665112
12. Akhan O, Durmaz H, Balcı S, Birgi E, Çiftçi T, Akıncı D. Percutaneous drainage of retroperitoneal abscesses: variables for success, failure, and recurrence. Diagn Interv Radiol. 2020;26(2):124-130. doi:10.5152/dir.2019.19199
13. Hong CH, Hong YC, Bae SH, et al. Laparoscopic drainage as a minimally invasive treatment for a psoas abscess: a single center case series and literature review. Medicine (Baltimore). 2020;99(14):e19640. doi:10.1097/MD.0000000000019640
Shrapnel injuries are commonly encountered in war zones.1 Retained fragments may remain asymptomatic, or they may cause health effects ranging from local reactions to systemic toxicities, depending on the patient’s reaction to the chemical composition and corrosiveness of the fragments in vivo.2 We present a case of a reactivated shrapnel injury in the form of a retroperitoneal infection and subsequent iliopsoas abscess. Surgery and interventional radiology collaborated on a joint procedure to snare and remove the infected fragment and drain the abscess.
Case Presentation
While serving in Vietnam, a soldier sustained a fragment injury to his left lower abdomen. He underwent a laparotomy, small bowel resection, and a temporary ileostomy at the time of the injury. Nearly 50 years later, the patient presented with chronic left lower quadrant pain and a low-grade fever. He was diagnosed clinically in the emergency department (ED) with diverticulitis and treated with antibiotics. The patient initially responded to treatment but returned 6 months later with similar symptoms, low-grade fever, and mild leukocytosis. A computed tomography (CT) scan without IV contrast during that encounter revealed a few scattered colonic diverticula without definite diverticulitis, as well as a metallic fragment embedded in the left iliopsoas with increased soft tissue density.
The patient was diagnosed with a pelvic/abdominal wall hematoma and was discharged with pain medication. The patient reported recurrent attacks of left lower quadrant pain, fever, and changes in bowel habits, prompting gastrointestinal consultation and a colonoscopy that was unremarkable. Ten months later, the patient again presented to the ED with recurrent symptoms, a fever of 102 °F, and leukocytosis with a white blood cell count of 11.7 × 10⁹/L. CT scan with IV contrast revealed a large left iliopsoas abscess associated with an approximately 1-cm metallic fragment (Figure 1). A drainage catheter was placed under CT guidance and approximately 270 mL of purulent fluid was drained. Culture of the fluid was positive for Escherichia coli (E coli). Two days after drain placement, the fragment was removed as a joint procedure with interventional radiology and surgery. Using the drainage catheter tract as a point of entry, multiple attempts were made to retrieve the fragment with Olympus EndoJaw endoscopic forceps without success.
Ultimately a stiff directional sheath from a Cook Medical transjugular liver biopsy kit was used with a Merit Medical EnSnare to relocate the fragment to the left inguinal region for surgical excision (Figures 2, 3, and 4). The fragment was removed and swabbed for culture and sensitivity and a BLAKE drain was placed in the evacuated abscess cavity. The patient tolerated the procedure well and was discharged the following day. Three days later, culture and sensitivity grew E coli and Acinetobacter, thus confirming infection and a nidus for the surrounding abscess formation. On follow-up with general surgery 7 days later, the patient reported he was doing well, and the drain was removed without difficulty.
Discussion
Foreign body injuries can be benign or debilitating depending on the initial damage, the anatomical location and composition of the foreign body, and the patient’s response to it. Retained shrapnel deep within the muscle tissue rarely causes complications. Although embedded objects are often asymptomatic and require no further management, migration of the foreign body or the formation of a fistula is possible, causing symptoms and requiring surgical intervention.1 In one case, a purulent fistula appeared a year after an explosive wound to the lumbosacral spine and was initially treated with antimicrobials. Recurrence of the fistula several times after treatment led to surgical removal of the shrapnel along with antibiotic treatment of the underlying osteomyelitis.3 Although uncommon, retained foreign body fragments from gunshot or military-related injuries can cause systemic lead toxicity. Symptoms may range from abdominal pain, nausea, and constipation to jaundice and hepatitis.4 Severity has been reported to correlate with the surface area of the lead exposed for dissolution.5 Migration of foreign bodies and shrapnel to other sites in the body, such as movement from soft tissues into distantly located body cavities, has also been reported. One such case involved the spontaneous onset of knee synovitis due to an intra-articular metallic object introduced via a blast injury to the upper third of the ipsilateral thigh.1
In this patient’s case, a large intramuscular abscess had formed nearly 50 years after the initial combat injury, requiring drainage of the abscess and removal of the fragment. By snaring the foreign body to a more superficial site, the surgical removal required only a minor incision, decreasing recovery time and the likelihood of postoperative complications associated with a large retroperitoneal dissection. While loop snare is often the first-line technique for removing intravascular foreign bodies, its use for materials retained in soft tissue is scarcely reported.6 More typical uses involve the removal of intraluminal materials, such as partially fractured venous catheters, guide wires, stents, and vena cava filters. One report noted that in all 16 cases of percutaneous foreign body retrieval, no surgical intervention was required.7 For most nonvascular foreign bodies, however, surgical retrieval is usually performed.8
Surgical removal of foreign bodies can be difficult when a foreign body is located next to vital structures.9 An additional challenge for a purely surgical approach arises when the foreign body is small and lies deep within the soft tissue, as in our patient. In such cases, the procedure can be time consuming and cause more trauma to the surrounding tissues.10 These factors warrant careful consideration of postoperative morbidity and mortality.
In our patient, the retained fragment was embedded in the wall of an abscess located retroperitoneally in his iliopsoas muscle. When considering the proximity of the iliopsoas muscle to the digestive tract, urinary tract, and iliac lymph nodes, it is reasonable for infectious material to come in contact with the foreign body from these nearby structures, resulting in secondary infection.11 Surgery was previously considered the first-line treatment for retroperitoneal abscesses until the advent of imaging-guided percutaneous drainage.12
In some instances, surgical drainage may still be attempted, such as when a concurrent disease process requires open surgery or when percutaneous catheter drainage is not technically possible due to the location of the abscess, thick exudate, loculations/septations, or phlegmon. In these cases, laparoscopic drainage can provide the benefits of an open procedure (ie, total drainage and resection of infected tissue) while being less invasive, requiring a smaller incision, and allowing faster healing.13 Percutaneous drainage is the current first-line treatment because it avoids general anesthesia, costs less, and has better morbidity and mortality outcomes than surgical methods.12 While percutaneous drainage proved immediately therapeutic for our patient, the risk of abscess recurrence with the retained infected fragment necessitated coordination of procedures across specialties to provide the best outcome.
Conclusions
This case demonstrates a multidisciplinary approach that transformed an otherwise large retroperitoneal dissection into a minimally invasive and technically efficient abscess drainage and foreign body retrieval.
1. Schroeder JE, Lowe J, Chaimsky G, Liebergall M, Mosheiff R. Migrating shrapnel: a rare cause of knee synovitis. Mil Med. 2010;175(11):929-930. doi:10.7205/milmed-d-09-00254
2. Centeno JA, Rogers DA, van der Voet GB, et al. Embedded fragments from U.S. military personnel—chemical analysis and potential health implications. Int J Environ Res Public Health. 2014;11(2):1261-1278. Published 2014 Jan 23. doi:10.3390/ijerph110201261
3. Carija R, Busic Z, Bradaric N, Bulovic B, Borzic Z, Pavicic-Perkovic S. Surgical removal of metallic foreign body (shrapnel) from the lumbosacral spine and the treatment of chronic osteomyelitis: a case report. West Indian Med J. 2014;63(4):373-375. doi:10.7727/wimj.2012.290
4. Grasso I, Blattner M, Short T, Downs J. Severe systemic lead toxicity resulting from extra-articular retained shrapnel presenting as jaundice and hepatitis: a case report and review of the literature. Mil Med. 2017;182(3-4):e1843-e1848. doi:10.7205/MILMED-D-16-00231
5. Dillman RO, Crumb CK, Lidsky MJ. Lead poisoning from a gunshot wound: report of a case and review of the literature. Am J Med. 1979;66(3):509-514. doi:10.1016/0002-9343(79)91083-0
6. Woodhouse JB, Uberoi R. Techniques for intravascular foreign body retrieval. Cardiovasc Intervent Radiol. 2013;36(4):888-897. doi:10.1007/s00270-012-0488-8
7. Mallmann CV, Wolf KJ, Wacker FK. Retrieval of vascular foreign bodies using a self-made wire snare. Acta Radiol. 2008;49(10):1124-1128. doi:10.1080/02841850802454741
8. Nosher JL, Siegel R. Percutaneous retrieval of nonvascular foreign bodies. Radiology. 1993;187(3):649-651. doi:10.1148/radiology.187.3.8497610
9. Fu Y, Cui LG, Romagnoli C, Li ZQ, Lei YT. Ultrasound-guided removal of retained soft tissue foreign body with late presentation. Chin Med J (Engl). 2017;130(14):1753-1754. doi:10.4103/0366-6999.209910
10. Liang HD, Li H, Feng H, Zhao ZN, Song WJ, Yuan B. Application of intraoperative navigation and positioning system in the removal of deep foreign bodies in the limbs. Chin Med J (Engl). 2019;132(11):1375-1377. doi:10.1097/CM9.0000000000000253
11. Moriarty CM, Baker RJ. A pain in the psoas. Sports Health. 2016;8(6):568-572. doi:10.1177/1941738116665112
12. Akhan O, Durmaz H, Balcı S, Birgi E, Çiftçi T, Akıncı D. Percutaneous drainage of retroperitoneal abscesses: variables for success, failure, and recurrence. Diagn Interv Radiol. 2020;26(2):124-130. doi:10.5152/dir.2019.19199
13. Hong CH, Hong YC, Bae SH, et al. Laparoscopic drainage as a minimally invasive treatment for a psoas abscess: a single center case series and literature review. Medicine (Baltimore). 2020;99(14):e19640. doi:10.1097/MD.0000000000019640
A ‘big breakfast’ diet affects hunger, not weight loss
The study, published in Cell Metabolism, comes from the University of Aberdeen. The idea that ‘front-loading’ calories early in the day might help dieting attempts was based on the belief that consuming the bulk of daily calories in the morning optimizes weight loss by burning calories more efficiently and quickly.
“There are a lot of myths surrounding the timing of eating and how it might influence either body weight or health,” said senior author Alexandra Johnstone, PhD, a researcher at the Rowett Institute, University of Aberdeen, who specializes in appetite control. “This has been driven largely by the circadian rhythm field. But we in the nutrition field have wondered how this could be possible. Where would the energy go? We decided to take a closer look at how time of day interacts with metabolism.”
Her team undertook a randomized crossover trial of 30 overweight and obese subjects recruited via social media ads. Participants – 16 men and 14 women – had a mean age of 51 years and a body mass index of 27-42 kg/m² but were otherwise healthy. The researchers compared two calorie-restricted but isoenergetic weight loss diets: morning-loaded calories, with 45% of intake at breakfast, 35% at lunch, and 20% at dinner, and evening-loaded calories, with the inverse proportions of 20%, 35%, and 45% at breakfast, lunch, and dinner, respectively.
Each diet was followed for 4 weeks, with a controlled baseline diet in which calories were balanced throughout the day provided for 1 week at the outset and during a 1-week washout period between the two intervention diets. Each person’s calorie intake was fixed, referenced to their individual measured resting metabolic rate, to assess the effect on weight loss and energy expenditure of meal timing under isoenergetic intake. Both diets were designed to provide the same nutrient composition of 30% protein, 35% carbohydrate, and 35% fat.
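The meal split described above is straightforward arithmetic. As an illustrative sketch (the daily intake figure below is hypothetical; in the study, each participant's target was referenced to their individually measured resting metabolic rate), the two schedules allocate a fixed daily intake as follows:

```python
def meal_calories(daily_kcal, split):
    """Allocate a fixed daily calorie target across breakfast, lunch, and dinner."""
    return {meal: round(daily_kcal * fraction) for meal, fraction in split.items()}

# The two isoenergetic schedules compared in the trial
MORNING_LOADED = {"breakfast": 0.45, "lunch": 0.35, "dinner": 0.20}
EVENING_LOADED = {"breakfast": 0.20, "lunch": 0.35, "dinner": 0.45}

daily_kcal = 1900  # hypothetical calorie-restricted target, not from the study

print(meal_calories(daily_kcal, MORNING_LOADED))
# {'breakfast': 855, 'lunch': 665, 'dinner': 380}
print(meal_calories(daily_kcal, EVENING_LOADED))
# {'breakfast': 380, 'lunch': 665, 'dinner': 855}
```

Because the two schedules are mirror images of each other, total daily intake is identical under both; only its distribution across the day changes.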
All food and beverages were provided, “making this the most rigorously controlled study to assess timing of eating in humans to date,” the team said, “with the aim of accounting for all aspects of energy balance.”
No optimum time to eat for weight loss
Results showed that both diets resulted in significant weight reduction at the end of each dietary intervention period, with subjects losing an average of just over 3 kg during each of the 4-week periods. However, there was no difference in weight loss between the morning-loaded and evening-loaded diets.
The relative size of breakfast and dinner – whether a person eats the largest meal early or late in the day – does not have an impact on metabolism, the team said. This challenges previous studies that have suggested that “evening eaters” – now a majority of the U.K. population – have a greater likelihood of gaining weight and greater difficulty in losing it.
“Participants were provided with all their meals for 8 weeks and their energy expenditure and body composition monitored for changes, using gold standard techniques at the Rowett Institute,” Dr. Johnstone said. “The same number of calories was consumed by volunteers at different times of the day, with energy expenditure measures using analysis of urine.
“This study is important because it challenges the previously held belief that eating at different times of the day leads to differential energy expenditure. The research shows that under weight loss conditions there is no optimum time to eat in order to manage weight, and that change in body weight is determined by energy balance.”
Meal timing reduces hunger but does not affect weight loss
However, the research also revealed that when subjects consumed the morning-loaded (big breakfast) diet, they reported feeling significantly less hungry later in the day. “Morning-loaded intake may assist with compliance to weight loss regime, through a greater suppression of appetite,” the authors said, adding that this “could foster easier weight loss in the real world.”
“The participants reported that their appetites were better controlled on the days they ate a bigger breakfast and that they felt satiated throughout the rest of the day,” Dr. Johnstone said.
“We know that appetite control is important to achieve weight loss, and our study suggests that those consuming the most calories in the morning felt less hungry, in contrast to when they consumed more calories in the evening period.
“This could be quite useful in the real-world environment, versus in the research setting that we were working in.”
‘Major finding’ for chrono-nutrition
Coauthor Jonathan Johnston, PhD, professor of chronobiology and integrative physiology at the University of Surrey, said: “This is a major finding for the field of meal timing (‘chrono-nutrition’) research. Many aspects of human biology change across the day and we are starting to understand how this interacts with food intake.
“Our new research shows that, in weight loss conditions, the size of breakfast and dinner regulates our appetite but not the total amount of energy that our bodies use,” Dr. Johnston said. “We plan to build upon this research to improve the health of the general population and specific groups, e.g., shift workers.”
It’s possible that shift workers could have different metabolic responses, due to the disruption of their circadian rhythms, the team said. Dr. Johnstone noted that this type of experiment could also be applied to the study of intermittent fasting (time-restricted eating), to help determine the best time of day for people to consume their calories.
“One thing that’s important to note is that when it comes to timing and dieting, there is not likely going to be one diet that fits all,” she concluded. “Figuring this out is going to be the future of diet studies, but it’s something that’s very difficult to measure.”
Great variability in individual responses to diets
Commenting on the study, Helena Gibson-Moore, RNutr (PH), nutrition scientist and spokesperson for the British Nutrition Foundation, said: “With about two in three adults in the UK either overweight or obese, it’s important that research continues to look into effective strategies for people to lose weight.” She described the study as “interesting,” and a challenge to previous research supporting “front-loading” calories earlier in the day as more effective for weight loss.
“However, whilst in this study there were no differences in weight loss, participants did report significantly lower hunger when eating a higher proportion of calories in the morning,” she said. “Therefore, for people who prefer having a big breakfast this may still be a useful way to help compliance to a weight loss regime through feeling less hungry in the evening, which in turn may lead to a reduced calorie intake later in the day.
“However, research has shown that as individuals we respond to diets in different ways. For example, a study comparing weight loss after a healthy low-fat diet vs. a healthy low-carbohydrate diet showed similar mean weight loss at 12 months, but there was large variability in the personal responses to each diet with some participants actually gaining weight.
“Differences in individual responses to dietary exposures has led to research into a personalized nutrition approach which requires collection of personal data and then provides individualized advice based on this.” Research has suggested that personalized dietary and physical activity advice was more effective than conventional generalized advice, she said.
“The bottom line for effective weight loss is that it is clear there is ‘no one size fits all’ approach and different weight loss strategies can work for different people but finding effective strategies for long-term sustainability of weight loss continues to be the major challenge. There are many factors that impact successful weight management and for some people it may not just be what we eat that is important, but also how and when we eat.”
This study was funded by the Medical Research Council and the Scottish Government, Rural and Environment Science and Analytical Services Division.
A version of this article first appeared on Medscape.co.uk.
The idea that “front-loading” calories early in the day might help dieting attempts was based on the belief that consuming the bulk of daily calories in the morning optimizes weight loss by burning calories more efficiently and quickly. A new study from the University of Aberdeen, published in Cell Metabolism, challenges that assumption.
“There are a lot of myths surrounding the timing of eating and how it might influence either body weight or health,” said senior author Alexandra Johnstone, PhD, a researcher at the Rowett Institute, University of Aberdeen, who specializes in appetite control. “This has been driven largely by the circadian rhythm field. But we in the nutrition field have wondered how this could be possible. Where would the energy go? We decided to take a closer look at how time of day interacts with metabolism.”
Her team undertook a randomized crossover trial of 30 overweight and obese subjects recruited via social media ads. Participants – 16 men and 14 women – had a mean age of 51 years and a body mass index of 27-42 kg/m2 but were otherwise healthy. The researchers compared two calorie-restricted but isoenergetic weight loss diets: morning-loaded calories, with 45% of intake at breakfast, 35% at lunch, and 20% at dinner; and evening-loaded calories, with the inverse proportions of 20%, 35%, and 45% at breakfast, lunch, and dinner, respectively.
Each diet was followed for 4 weeks, with a controlled baseline diet in which calories were balanced throughout the day provided for 1 week at the outset and during a 1-week washout period between the two intervention diets. Each person’s calorie intake was fixed, referenced to their individual measured resting metabolic rate, to assess the effect on weight loss and energy expenditure of meal timing under isoenergetic intake. Both diets were designed to provide the same nutrient composition of 30% protein, 35% carbohydrate, and 35% fat.
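The two arms differ only in how a fixed daily budget is split across meals. As a rough illustration of that arithmetic (the 1,800 kcal budget is hypothetical; in the trial each participant's budget was referenced to their measured resting metabolic rate):

```python
def meal_calories(daily_kcal, split=(0.45, 0.35, 0.20)):
    """Allocate a fixed daily calorie budget across breakfast, lunch, dinner.

    The default `split` is the study's morning-loaded arm (45/35/20);
    reversing it gives the evening-loaded arm. `daily_kcal` is a
    hypothetical stand-in for each participant's individualized budget.
    """
    assert abs(sum(split) - 1.0) < 1e-9, "proportions must sum to 1"
    return {meal: round(daily_kcal * frac)
            for meal, frac in zip(("breakfast", "lunch", "dinner"), split)}

morning = meal_calories(1800)                       # {'breakfast': 810, 'lunch': 630, 'dinner': 360}
evening = meal_calories(1800, (0.20, 0.35, 0.45))   # inverse proportions, same daily total
```

Because the arms are isoenergetic, only the distribution changes; the daily total is identical in both.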
All food and beverages were provided, “making this the most rigorously controlled study to assess timing of eating in humans to date,” the team said, “with the aim of accounting for all aspects of energy balance.”
No optimum time to eat for weight loss
Results showed that both diets resulted in significant weight reduction at the end of each dietary intervention period, with subjects losing an average of just over 3 kg during each of the 4-week periods. However, there was no difference in weight loss between the morning-loaded and evening-loaded diets.
The relative size of breakfast and dinner – whether a person eats the largest meal early or late in the day – does not have an impact on metabolism, the team said. This challenges previous studies that have suggested that “evening eaters” – now a majority of the U.K. population – have a greater likelihood of gaining weight and greater difficulty in losing it.
“Participants were provided with all their meals for 8 weeks and their energy expenditure and body composition monitored for changes, using gold standard techniques at the Rowett Institute,” Dr. Johnstone said. “The same number of calories was consumed by volunteers at different times of the day, with energy expenditure measured using analysis of urine.
“This study is important because it challenges the previously held belief that eating at different times of the day leads to differential energy expenditure. The research shows that under weight loss conditions there is no optimum time to eat in order to manage weight, and that change in body weight is determined by energy balance.”
Meal timing reduces hunger but does not affect weight loss
However, the research also revealed that when subjects consumed the morning-loaded (big breakfast) diet, they reported feeling significantly less hungry later in the day. “Morning-loaded intake may assist with compliance to weight loss regime, through a greater suppression of appetite,” the authors said, adding that this “could foster easier weight loss in the real world.”
“The participants reported that their appetites were better controlled on the days they ate a bigger breakfast and that they felt satiated throughout the rest of the day,” Dr. Johnstone said.
“We know that appetite control is important to achieve weight loss, and our study suggests that those consuming the most calories in the morning felt less hungry, in contrast to when they consumed more calories in the evening period.
“This could be quite useful in the real-world environment, versus in the research setting that we were working in.”
‘Major finding’ for chrono-nutrition
Coauthor Jonathan Johnston, PhD, professor of chronobiology and integrative physiology at the University of Surrey, said: “This is a major finding for the field of meal timing (‘chrono-nutrition’) research. Many aspects of human biology change across the day and we are starting to understand how this interacts with food intake.
“Our new research shows that, in weight loss conditions, the size of breakfast and dinner regulates our appetite but not the total amount of energy that our bodies use,” Dr. Johnston said. “We plan to build upon this research to improve the health of the general population and specific groups, e.g., shift workers.”
It’s possible that shift workers could have different metabolic responses, due to the disruption of their circadian rhythms, the team said. Dr. Johnstone noted that this type of experiment could also be applied to the study of intermittent fasting (time-restricted eating), to help determine the best time of day for people to consume their calories.
“One thing that’s important to note is that when it comes to timing and dieting, there is not likely going to be one diet that fits all,” she concluded. “Figuring this out is going to be the future of diet studies, but it’s something that’s very difficult to measure.”
Great variability in individual responses to diets
Commenting on the study, Helena Gibson-Moore, RNutr (PH), nutrition scientist and spokesperson for the British Nutrition Foundation, said: “With about two in three adults in the UK either overweight or obese, it’s important that research continues to look into effective strategies for people to lose weight.” She described the study as “interesting,” and a challenge to previous research supporting “front-loading” calories earlier in the day as more effective for weight loss.
“However, whilst in this study there were no differences in weight loss, participants did report significantly lower hunger when eating a higher proportion of calories in the morning,” she said. “Therefore, for people who prefer having a big breakfast this may still be a useful way to help compliance to a weight loss regime through feeling less hungry in the evening, which in turn may lead to a reduced calorie intake later in the day.
“However, research has shown that as individuals we respond to diets in different ways. For example, a study comparing weight loss after a healthy low-fat diet vs. a healthy low-carbohydrate diet showed similar mean weight loss at 12 months, but there was large variability in the personal responses to each diet with some participants actually gaining weight.
“Differences in individual responses to dietary exposures have led to research into a personalized nutrition approach, which requires collection of personal data and then provides individualized advice based on this.” Research has suggested that personalized dietary and physical activity advice was more effective than conventional generalized advice, she said.
“The bottom line for effective weight loss is that it is clear there is ‘no one size fits all’ approach and different weight loss strategies can work for different people but finding effective strategies for long-term sustainability of weight loss continues to be the major challenge. There are many factors that impact successful weight management and for some people it may not just be what we eat that is important, but also how and when we eat.”
This study was funded by the Medical Research Council and the Scottish Government, Rural and Environment Science and Analytical Services Division.
A version of this article first appeared on Medscape.co.uk.
FROM CELL METABOLISM
Successful Use of Lanadelumab in an Older Patient With Type II Hereditary Angioedema
Hereditary angioedema (HAE) is a rare genetic disorder affecting about 1 in 67,000 individuals and may lead to increased morbidity and mortality.1,2 HAE is characterized by recurring episodes of subcutaneous and/or submucosal edema without urticaria due to an excess of bradykinin.2,3 Autosomal dominant inheritance is present in 75% of patients with HAE and is classified into 2 main types.2 Type I HAE is caused by deficiency of C1 esterase inhibitor, accounting for 85% of cases.2 Type II HAE is marked by normal to elevated levels of C1 esterase inhibitor but with reduced activity.2
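The type I/II distinction described above reduces to a simple decision rule: a low C1 esterase inhibitor level indicates type I, while a normal-to-elevated level with reduced functional activity indicates type II. A schematic sketch of that logic follows; the reference-range cutoffs are hypothetical placeholder values, not figures from this article, and real interpretation depends on each laboratory's ranges:

```python
def classify_hae(c1_inh_level_mg_dl, c1_inh_activity_pct,
                 level_lower_limit=19.0, activity_lower_limit=68.0):
    """Schematic HAE subtyping per the text: type I = deficient C1-INH level;
    type II = normal/elevated level but reduced activity.

    The two lower-limit cutoffs are hypothetical lab reference values.
    """
    if c1_inh_level_mg_dl < level_lower_limit:
        return "type I"
    if c1_inh_activity_pct < activity_lower_limit:
        return "type II"
    return "not consistent with type I/II HAE"

# The case reported below: normal level (24 mg/dL) with low activity (28%)
print(classify_hae(24, 28))  # → "type II" under the assumed cutoffs
```

This mirrors why the patient's normal C1 esterase inhibitor level did not rule out HAE: functional activity, not level alone, identifies type II disease.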
Cutaneous and abdominal angioedema attacks are the most common presentation.1 However, any location may be affected, including the face, oropharynx, and larynx.1 Only 0.9% of all HAE attacks cause laryngeal edema, but 50% of HAE patients have experienced a laryngeal attack, which may be lethal.1 An angioedema attack can range in severity, depending on the location and degree of edema.3 In addition, patients with HAE often are diagnosed with anxiety and depression secondary to their poor quality of life.4 Thus, long-term prophylaxis of attacks is crucial to reduce the physical and psychological implications.
Previously, HAE was treated with antifibrinolytic agents and attenuated androgens for short- and long-term prophylaxis.1 These treatment modalities are now considered second-line since the development of novel medications with improved efficacy and limited adverse effects (AEs).1 For long-term prophylaxis, subcutaneous and IV C1 esterase inhibitor has been proven effective in both types I and II HAE.1 Another option, lanadelumab, a subcutaneously delivered monoclonal antibody inhibitor of plasma kallikrein, has been proven to decrease the frequency of HAE attacks without significant AEs.5 Lanadelumab works by binding to the active site of plasma kallikrein, which reduces its activity and slows the production of bradykinin.6 This results in decreasing vascular permeability and swelling episodes in patients with HAE.7 Data, however, are limited, specifically regarding patients with type II HAE and patients aged ≥ 65 years.5 This article reports on an older male with type II HAE successfully treated with lanadelumab.
Case Presentation
An 81-year-old male patient with hypertension, hypertriglyceridemia, and aortic aneurysm had recurrent, frequent episodes of severe abdominal pain with a remote history of extremity and scrotal swelling since adolescence. He was misdiagnosed for years and was eventually determined to have HAE at age 75 years after his niece was diagnosed, prompting him to be reevaluated for his frequent bouts of abdominal pain. His laboratory findings were consistent with HAE type II with low C4 (7.8 mg/dL), normal C1 esterase inhibitor levels (24 mg/dL), and low levels of C1 esterase inhibitor activity (28% of normal).
Initially, he described having weekly attacks of abdominal pain that could last 1 to several days. At worst, these attacks would last up to a month, causing a decrease in appetite and weight loss. At age 77 years, he began an on-demand treatment, icatibant, a bradykinin receptor blocker. After initiating icatibant during an acute attack, the pain would diminish within 1 to 2 hours, and within several hours, he would be pain free. Previously, pain relief would take several days to weeks. He continued to use icatibant on-demand, typically requiring treatment every 1 to 2 months for only the more severe attacks.
After an increasing frequency of abdominal pain attacks, prophylactic medication was recommended. Therefore, subcutaneous lanadelumab 300 mg every 2 weeks was initiated for long-term prophylaxis. The patient went from requiring on-demand treatment 2 to 3 times per month to once in 6 months after starting lanadelumab. In addition, he tolerated the medication well without any AEs.
Discussion
According to the international WAO/EAACI 2021 guidelines, HAE treatment goals are “to achieve complete control of the disease and to normalize patients’ lives.”8 On-demand treatment options include C1 esterase inhibitor, icatibant, or ecallantide (a kallikrein inhibitor).8 Long-term prophylaxis in HAE should be considered, accounting for disease activity, burden, control, and patient preference. Five medications have been used for long-term prophylaxis: antifibrinolytic agents (not recommended), attenuated androgens (considered second-line), C1 esterase inhibitor, berotralstat, and lanadelumab.8
Antifibrinolytics are no longer recommended for long-term prophylactic treatment because their efficacy is poor, and they were not considered for our patient. Attenuated androgens, such as danazol, have a history of prophylactic use in patients with HAE due to their good efficacy but are suboptimal due to their significant AE profile and many drug-drug interactions.8 In addition, androgens have many contraindications, including hypertension and hypertriglyceridemia, both of which were present in our patient. Consequently, danazol was not an advised treatment for our patient. C1 esterase inhibitor is often used to prevent HAE attacks and can be given intravenously or subcutaneously, typically administered biweekly. A potential AE of C1 esterase inhibitor is thrombosis. Therefore, C1 esterase inhibitor was not a preferred choice in our older patient with a history of hypercoagulability. Berotralstat, a plasma kallikrein inhibitor, is an oral treatment option that also has shown efficacy in long-term prophylaxis. The most common AEs of berotralstat are gastrointestinal symptoms, and the medication requires dose adjustment for patients with hepatic impairment.8 Berotralstat was not considered because it was not an approved treatment option at the time of this patient’s treatment. Lanadelumab is a human monoclonal antibody against plasma kallikrein that decreases bradykinin production in patients with HAE, thus preventing angioedema attacks.5 Data regarding the use of lanadelumab in patients with type II HAE are limited, but because HAE with normal C1 esterase inhibitor levels involves the production of bradykinin via kallikrein, lanadelumab should still be effective.1 Lanadelumab was chosen for our patient because of its minimal AEs; it is not known to increase the risk of thrombosis.
Lanadelumab is a novel medication, approved in 2018 by the US Food and Drug Administration for the treatment of types I and II HAE in patients aged ≥ 12 years.7 The phase 3 Hereditary Angioedema Long-term Prophylaxis (HELP) study concluded that treatment with subcutaneous lanadelumab for 26 weeks significantly decreased the frequency of angioedema attacks compared with placebo.5 However, 113 patients (90.4%) in the HELP study had type I HAE.5 Of the 125 patients who completed this randomized, double-blind study, only 12 had type II HAE.5 In addition, the study included only 5 patients aged ≥ 65 years, and no patients aged ≥ 65 years were in the treatment arms that received a lanadelumab dose of 300 mg.5 In a case series of 12 patients in Canada, treatment with lanadelumab decreased angioedema attacks by 72%.9 However, that series included only 1 patient with type II HAE, who was aged 36 years.9 Therefore, our case demonstrates the efficacy of lanadelumab in a patient aged ≥ 65 years with type II HAE.
Conclusions
HAE is a rare and potentially fatal disease characterized by recurrent, unpredictable attacks of edema throughout the body. The disease burden adversely affects a patient’s quality of life. Therefore, long-term prophylaxis is critical to managing patients with HAE. Lanadelumab has been proven as an effective long-term prophylactic treatment option for HAE attacks. This case supports the use of lanadelumab in patients with type II HAE and patients aged ≥ 65 years.
Acknowledgments
This patient’s delayed diagnosis was initially written up as a case report.3 An earlier version of this article was presented by Samuel Weiss, MD, and Derek Smith, MD, as a poster at the American Academy of Allergy, Asthma, and Immunology virtual conference, February 26 to March 1, 2021.
1. Busse PJ, Christiansen SC. Hereditary angioedema. N Engl J Med. 2020;382(12):1136-1148. doi:10.1056/NEJMra1808012
2. Bernstein JA. Severity of hereditary angioedema, prevalence, and diagnostic considerations. Am J Manag Care. 2018;24(14)(suppl):S292-S298.
3. Berger J, Carroll MP Jr, Champoux E, Coop CA. Extremely delayed diagnosis of type II hereditary angioedema: case report and review of the literature. Mil Med. 2018;183(11-12):e765-e767. doi:10.1093/milmed/usy031
4. Fouche AS, Saunders EF, Craig T. Depression and anxiety in patients with hereditary angioedema. Ann Allergy Asthma Immunol. 2014;112(4):371-375. doi:10.1016/j.anai.2013.05.028
5. Banerji A, Riedl MA, Bernstein JA, et al; HELP Investigators. Effect of lanadelumab compared with placebo on prevention of hereditary angioedema attacks: a randomized clinical trial. JAMA. 2018;320(20):2108-2121. doi:10.1001/jama.2018.16773
6. Busse PJ, Farkas H, Banerji A, et al. Lanadelumab for the prophylactic treatment of hereditary angioedema with C1 inhibitor deficiency: a review of preclinical and phase I studies. BioDrugs. 2019;33(1):33-43. doi:10.1007/s40259-018-0325-y
7. Riedl MA, Maurer M, Bernstein JA, et al. Lanadelumab demonstrates rapid and sustained prevention of hereditary angioedema attacks. Allergy. 2020;75(11):2879-2887. doi:10.1111/all.14416
8. Maurer M, Magerl M, Betschel S, et al. The international WAO/EAACI guideline for the management of hereditary angioedema—the 2021 revision and update. Allergy. 2022;77(7):1961-1990. doi:10.1111/all.15214
9. Iaboni A, Kanani A, Lacuesta G, Song C, Kan M, Betschel SD. Impact of lanadelumab in hereditary angioedema: a case series of 12 patients in Canada. Allergy Asthma Clin Immunol. 2021;17(1):78. Published 2021 Jul 23. doi:10.1186/s13223-021-00579-6
Hereditary angioedema (HAE) is a rare genetic disorder affecting about 1 in 67,000 individuals and may lead to increased morbidity and mortality.1,2 HAE is characterized by recurring episodes of subcutaneous and/or submucosal edema without urticaria due to an excess of bradykinin.2,3 Autosomal dominant inheritance is present in 75% of patients with HAE and is classified into 2 main types.2 Type I HAE is caused by deficiency of C1 esterase inhibitor, accounting for 85% of cases.2 Type II HAE is marked by normal to elevated levels of C1 esterase inhibitor but with reduced activity.2
Cutaneous and abdominal angioedema attacks are the most common presentation.1 However, any location may be affected, including the face, oropharynx, and larynx.1 Only 0.9% of all HAE attacks cause laryngeal edema, but 50% of HAE patients have experienced a laryngeal attack, which may be lethal.1 An angioedema attack can range in severity, depending on the location and degree of edema.3 In addition, patients with HAE often are diagnosed with anxiety and depression secondary to their poor quality of life.4 Thus, long-term prophylaxis of attacks is crucial to reduce the physical and psychological implications.
Previously, HAE was treated with antifibrinolytic agents and attenuated androgens for short- and long-term prophylaxis.1 These treatment modalities are now considered second-line since the development of novel medications with improved efficacy and limited adverse effects (AEs).1 For long-term prophylaxis, subcutaneous and IV C1 esterase inhibitor has been proven effective in both types I and II HAE.1 Another option, lanadelumab, a subcutaneously delivered monoclonal antibody inhibitor of plasma kallikrein, has been proven to decrease the frequency of HAE attacks without significant AEs.5 Lanadelumab works by binding to the active site of plasma kallikrein, which reduces its activity and slows the production of bradykinin.6 This results in decreasing vascular permeability and swelling episodes in patients with HAE.7 Data, however, are limited, specifically regarding patients with type II HAE and patients aged ≥ 65 years.5 This article reports on an older male with type II HAE successfully treated with lanadelumab.
Case Presentation
An 81-year-old male patient with hypertension, hypertriglyceridemia, and aortic aneurysm had recurrent, frequent episodes of severe abdominal pain with a remote history of extremity and scrotal swelling since adolescence. He was misdiagnosed for years and was eventually determined to have HAE at age 75 years after his niece was diagnosed, prompting him to be reevaluated for his frequent bouts of abdominal pain. His laboratory findings were consistent with HAE type II with low C4 (7.8 mg/dL), normal C1 esterase inhibitor levels (24 mg/dL), and low levels of C1 esterase inhibitor activity (28% of normal).
Initially, he described having weekly attacks of abdominal pain that could last 1 to several days. At worst, these attacks would last up to a month, causing a decrease in appetite and weight loss. At age 77 years, he began an on-demand treatment, icatibant, a bradykinin receptor blocker. After initiating icatibant during an acute attack, the pain would diminish within 1 to 2 hours, and within several hours, he would be pain free. Previously, pain relief would take several days to weeks. He continued to use icatibant on-demand, typically requiring treatment every 1 to 2 months for only the more severe attacks.
After an increasing frequency of abdominal pain attacks, prophylactic medication was recommended. Therefore, subcutaneous lanadelumab 300 mg every 2 weeks was initiated for long-term prophylaxis. The patient went from requiring on-demand treatment 2 to 3 times per month to once in 6 months after starting lanadelumab. In addition, he tolerated the medication well without any AEs.
Discussion
Hereditary angioedema (HAE) is a rare genetic disorder affecting about 1 in 67,000 individuals and may lead to increased morbidity and mortality.1,2 HAE is characterized by recurring episodes of subcutaneous and/or submucosal edema without urticaria due to an excess of bradykinin.2,3 HAE follows an autosomal dominant inheritance pattern in 75% of patients and is classified into 2 main types.2 Type I HAE, which accounts for 85% of cases, is caused by a deficiency of C1 esterase inhibitor.2 Type II HAE is marked by normal to elevated levels of C1 esterase inhibitor but with reduced activity.2
Cutaneous and abdominal angioedema attacks are the most common presentation.1 However, any location may be affected, including the face, oropharynx, and larynx.1 Only 0.9% of all HAE attacks cause laryngeal edema, but 50% of patients with HAE have experienced a laryngeal attack, which may be lethal.1 An angioedema attack can range in severity depending on the location and degree of edema.3 In addition, patients with HAE often are diagnosed with anxiety and depression secondary to their poor quality of life.4 Thus, long-term prophylaxis of attacks is crucial to reduce the physical and psychological burden of the disease.
Previously, HAE was treated with antifibrinolytic agents and attenuated androgens for short- and long-term prophylaxis.1 These treatment modalities are now considered second-line since the development of novel medications with improved efficacy and limited adverse effects (AEs).1 For long-term prophylaxis, subcutaneous and IV C1 esterase inhibitor has been proven effective in both type I and type II HAE.1 Another option, lanadelumab, a subcutaneously delivered monoclonal antibody inhibitor of plasma kallikrein, has been proven to decrease the frequency of HAE attacks without significant AEs.5 Lanadelumab binds to the active site of plasma kallikrein, reducing its activity and slowing the production of bradykinin.6 This decreases vascular permeability and swelling episodes in patients with HAE.7 Data are limited, however, specifically regarding patients with type II HAE and patients aged ≥ 65 years.5 This article reports on an older male patient with type II HAE successfully treated with lanadelumab.
Case Presentation
An 81-year-old male patient with hypertension, hypertriglyceridemia, and aortic aneurysm had recurrent, frequent episodes of severe abdominal pain with a remote history of extremity and scrotal swelling since adolescence. He was misdiagnosed for years and was eventually determined to have HAE at age 75 years after his niece was diagnosed, prompting him to be reevaluated for his frequent bouts of abdominal pain. His laboratory findings were consistent with HAE type II with low C4 (7.8 mg/dL), normal C1 esterase inhibitor levels (24 mg/dL), and low levels of C1 esterase inhibitor activity (28% of normal).
Initially, he described having weekly attacks of abdominal pain that could last 1 to several days. At worst, these attacks would last up to a month, causing a decrease in appetite and weight loss. At age 77 years, he began an on-demand treatment, icatibant, a bradykinin receptor blocker. After initiating icatibant during an acute attack, the pain would diminish within 1 to 2 hours, and within several hours, he would be pain free. Previously, pain relief would take several days to weeks. He continued to use icatibant on-demand, typically requiring treatment every 1 to 2 months for only the more severe attacks.
After an increasing frequency of abdominal pain attacks, prophylactic medication was recommended. Therefore, subcutaneous lanadelumab 300 mg every 2 weeks was initiated for long-term prophylaxis. The patient went from requiring on-demand treatment 2 to 3 times per month to once in 6 months after starting lanadelumab. In addition, he tolerated the medication well without any AEs.
Discussion
According to the international WAO/EAACI 2021 guidelines, HAE treatment goals are “to achieve complete control of the disease and to normalize patients’ lives.”8 On-demand treatment options include C1 esterase inhibitor, icatibant, or ecallantide (a kallikrein inhibitor).8 Long-term prophylaxis in HAE should be considered, accounting for disease activity, burden, control, and patient preference. Five medications have been used for long-term prophylaxis: antifibrinolytic agents (not recommended), attenuated androgens (considered second-line), C1 esterase inhibitor, berotralstat, and lanadelumab.8
Antifibrinolytics are no longer recommended for long-term prophylaxis because of their poor efficacy and were not considered for our patient. Attenuated androgens, such as danazol, have a history of prophylactic use in patients with HAE because of their good efficacy but are suboptimal due to their significant AE profile and many drug-drug interactions.8 In addition, androgens have many contraindications, including hypertension and hypertriglyceridemia, both of which were present in our patient. Consequently, danazol was not an advisable treatment for our patient. C1 esterase inhibitor is often used to prevent HAE attacks and can be given intravenously or subcutaneously, typically administered twice weekly. A potential AE of C1 esterase inhibitor is thrombosis. Therefore, C1 esterase inhibitor was not a preferred choice for our older patient with a history of hypercoagulability. Berotralstat, a plasma kallikrein inhibitor, is an oral option that also has shown efficacy in long-term prophylaxis. The most common AEs of berotralstat are gastrointestinal symptoms, and the medication requires dose adjustment for patients with hepatic impairment.8 Berotralstat was not considered because it was not an approved option at the time of this patient’s treatment. Lanadelumab is a human monoclonal antibody against plasma kallikrein that decreases bradykinin production in patients with HAE, thus preventing angioedema attacks.5 Data on the use of lanadelumab in patients with type II HAE are limited, but because HAE with normal C1 esterase inhibitor levels also involves kallikrein-mediated production of bradykinin, lanadelumab should still be effective.1 Lanadelumab was chosen for our patient because of its minimal AEs and because it is not known to increase the risk of thrombosis.
Lanadelumab is a novel medication, approved by the US Food and Drug Administration in 2018 for the treatment of type I and type II HAE in patients aged ≥ 12 years.7 The phase 3 Hereditary Angioedema Long-term Prophylaxis (HELP) study concluded that treatment with subcutaneous lanadelumab for 26 weeks significantly decreased the frequency of angioedema attacks compared with placebo.5 However, 113 patients (90.4%) in the HELP study had type I HAE.5 Of the 125 patients who completed this randomized, double-blind study, only 12 had type II HAE.5 In addition, the study included only 5 patients aged ≥ 65 years, and no patients aged ≥ 65 years were in the treatment arms that received the 300-mg lanadelumab dose.5 In a case series of 12 patients in Canada, treatment with lanadelumab decreased angioedema attacks by 72%.9 However, that series included only 1 patient with type II HAE, who was aged 36 years.9 Therefore, our case demonstrates the efficacy of lanadelumab in a patient aged ≥ 65 years with type II HAE.
Conclusions
HAE is a rare and potentially fatal disease characterized by recurrent, unpredictable attacks of edema throughout the body. The disease burden adversely affects a patient’s quality of life, making long-term prophylaxis critical to the management of HAE. Lanadelumab has been proven to be an effective long-term prophylactic treatment for HAE attacks. This case supports the use of lanadelumab in patients with type II HAE and in patients aged ≥ 65 years.
Acknowledgments
The patient was initially written up based on his delayed diagnosis as a case report.3 An earlier version of this article was presented by Samuel Weiss, MD, and Derek Smith, MD, as a poster at the American Academy of Allergy, Asthma, and Immunology virtual conference February 26 to March 1, 2021.
1. Busse PJ, Christiansen SC. Hereditary angioedema. N Engl J Med. 2020;382(12):1136-1148. doi:10.1056/NEJMra1808012
2. Bernstein JA. Severity of hereditary angioedema, prevalence, and diagnostic considerations. Am J Manag Care. 2018;24(14)(suppl):S292-S298.
3. Berger J, Carroll MP Jr, Champoux E, Coop CA. Extremely delayed diagnosis of type II hereditary angioedema: case report and review of the literature. Mil Med. 2018;183(11-12):e765-e767. doi:10.1093/milmed/usy031
4. Fouche AS, Saunders EF, Craig T. Depression and anxiety in patients with hereditary angioedema. Ann Allergy Asthma Immunol. 2014;112(4):371-375. doi:10.1016/j.anai.2013.05.028
5. Banerji A, Riedl MA, Bernstein JA, et al; HELP Investigators. Effect of lanadelumab compared with placebo on prevention of hereditary angioedema attacks: a randomized clinical trial. JAMA. 2018;320(20):2108-2121. doi:10.1001/jama.2018.16773
6. Busse PJ, Farkas H, Banerji A, et al. Lanadelumab for the prophylactic treatment of hereditary angioedema with C1 inhibitor deficiency: a review of preclinical and phase I studies. BioDrugs. 2019;33(1):33-43. doi:10.1007/s40259-018-0325-y
7. Riedl MA, Maurer M, Bernstein JA, et al. Lanadelumab demonstrates rapid and sustained prevention of hereditary angioedema attacks. Allergy. 2020;75(11):2879-2887. doi:10.1111/all.14416
8. Maurer M, Magerl M, Betschel S, et al. The international WAO/EAACI guideline for the management of hereditary angioedema—the 2021 revision and update. Allergy. 2022;77(7):1961-1990. doi:10.1111/all.15214
9. Iaboni A, Kanani A, Lacuesta G, Song C, Kan M, Betschel SD. Impact of lanadelumab in hereditary angioedema: a case series of 12 patients in Canada. Allergy Asthma Clin Immunol. 2021;17(1):78. Published 2021 Jul 23. doi:10.1186/s13223-021-00579-6
75 Years of the Historic Partnership Between the VA and Academic Medical Centers
The US government has a legacy of providing support for veterans. Pensions were offered to disabled veterans as early as 1776, and benefits were expanded to cover medical needs as the country grew and modernized.1,2 Enacted during the Civil War, the General Pension Act increased benefits for widows and dependents.2 Rehabilitation and vocational training assistance benefits were added after World War I, and the US Department of Veterans Affairs (VA) was created in 1930 to consolidate all benefits under one umbrella organization.2,3
Prior to World War II, the VA lacked the bed capacity for the 4 million veterans who were eligible for care. This shortage became more acute by the end of the war, when the number of eligible veterans increased by 15 million.4 Although the VA successfully built bed capacity through the acquisition of military hospitals, VA hospitals struggled to recruit clinical staff.2 Physicians were hesitant to join the VA because civil service salaries were lower than those of comparable positions in the community, and the VA offered limited opportunities for research or continuing education. These limitations negatively impacted the overall reputation of the VA. The American Medical Association (AMA) was reluctant to directly admit VA physicians for membership because of a “lower” standard of care at VA hospitals.2 This review describes how the passage of 2 legislative actions, the Servicemen’s Readjustment Act and Public Law (PL) 79-293, and a key policy memorandum set the foundation for the partnership between the VA and academic medical centers, leading to improved medical care for veterans and expansion of health professions education for the VA and the nation.5,6
GI Bill of Rights
The passage of the Servicemen’s Readjustment Act of 1944, better known as the GI Bill of Rights, provided education assistance, guaranteed home loans, and unemployment payments to veterans.5 All medical officers serving during the war were eligible for this benefit, which effectively increased the number of potential physician trainees at the end of World War II by almost 60,000.7 Medical education at the time was simultaneously undergoing a transformation, with more rigorous training and a push to standardize medical education across state lines. While prerequisite training was not required for admission to many medical schools and curricula varied in length based on state licensing requirements, more programs were adding premedical education requirements and transitioning to the 4-year curricula seen today. At this time, only 23 states required postgraduate internships for licensure, but this number was growing.8 The American Board of Medical Specialties had been established in 1934, several years before World War II, to elevate the quality of care; the desire for residency training and board certification continued to gain traction during the 1940s.9
Medical Training
In anticipation of an influx of medical trainees, the Committee on Postwar Medical Service conducted a comprehensive survey to understand the training needs of physician veterans returning from World War II.7 The survey collected data from medical officers on their desired length of training, interest in specialty board certification, time served, and type of medical practice prior to enlisting. Length of desired training was categorized as short (up to 6 months), which would serve as a refresher course and provide updates on recent advances in medicine and surgery, and long (> 6 months), which resembled a modern internship or residency. Nineteen percent did not want additional training, 22% wished to pursue short courses, and 51% were interested in longer courses. Most respondents also wished to obtain board certification.7 The AMA played a significant role in supporting the expansion of training opportunities, encouraging all accredited hospitals to assess their capacity to determine the number of additional residents they could accommodate. The AMA also awarded hospitals with existing internship programs temporary accreditation to allow them to add extended training through residency programs.7
Medical schools devised creative solutions to meet the needs of returning physician veterans and capitalize on the available educational benefits. Postgraduate refresher courses that varied in length from hours to months were developed, focusing on an array of topics. In addition to basic medical principles, courses ranged from general topics, such as advances in medicine, to specialty topics, such as nutrition or ophthalmology.7 Although the courses could not be counted toward board certification, participation increased by almost 300% in the 1945/1946 academic year relative to the previous year.7 Increasing access to the longer training courses, including internships and residencies, was often achieved through experiences outside the clinical setting. Yale University modified its curriculum to reduce time devoted to lectures on published materials and encourage active learning and community outreach.10 Northwestern University assigned residents to spend 1 of their 3 years “out of residence” in basic science and clinical instruction provided by the medical school. Tuition assistance from the GI Bill supported the additional expenses incurred by the medical school to fund laboratory space, equipment, and the salaries of the basic science instructors and administrative staff.11
Public Law 79-293
Public Law 79-293 was passed on January 3, 1946, establishing the Department of Medicine and Surgery within the VA. The law, which became the basis for Title 38 chapters 73 and 74, allowed VA hospitals flexibility to hire doctors, dentists, and nurses without regard to the civil service regulations and salary restrictions associated with other federal positions.6
Concerns about quality of care had been mounting for years, and the release of several sensationalized and critical articles motivated VA leadership to make sweeping changes. One article described neglect at VA hospitals.12 Excessive paperwork and low economic benefits were identified as barriers to the recruitment of qualified clinicians at the VA.2 The VA Special Medical Advisory Group investigating the claims recommended that the VA encourage their hospitals to affiliate with medical schools to improve the quality of care. This group also recommended that new VA hospitals be constructed near academic medical centers to allow access to consultants.2 Three large veterans service organizations (American Legion, Veterans of Foreign Wars, and Disabled American Veterans) conducted their own investigations in response to the media reports. The organizations reported that the quality of care in most VA hospitals was already on par with the community but indicated that the VA would benefit from expansion of medical research and training, increased bed capacity, reduction in the administrative burden on clinicians, and increased salaries for clinical staff.2
Policy Memorandum No. 2
The relationship between VA and academic medical centers was solidified on January 30, 1946, with the adoption of Policy Memorandum No. 2.13 This memorandum allowed for the establishment of relationships with academic medical centers to provide “the veteran a much higher standard of medical care than could be given him with a wholly full-time medical staff.” Shortly after this memorandum was signed, residents from Northwestern University and the University of Illinois at Chicago began clinical rotations at the Hines VA facility in Chicago, Illinois.2 By 1947, 62 medical schools had committed to an affiliation with local VA hospitals, and 21 deans’ committees, responsible for the appointment of physician residents and consultants, were in operation. The AMA extended direct membership privileges to VA physicians, and by 1947 the number of residency positions had doubled nationally.14,15 The almost universal support of the relationship between VA and academic affiliates provided educational opportunities for returning veterans and raised standards for medical education nationally.
Current State
Since the passage of PL 79-293 and Policy Memorandum No. 2, the VA-academic health professions education partnership has grown to include 113,000 trainees from more than 1400 colleges and universities rotating through 150 VA medical centers annually.16 Most VA podiatrists, psychologists, optometrists, and physicians working in VA medical centers also trained at VA, and trainees are 37% more likely to consider a job at VA after completing their clinical rotations. This unique partnership began 76 years ago and continues to provide clinicians “for VA and the nation.”
1. Glasson WH. History of military pension legislation in the United States. Columbia University Press; 1900.
2. Lewis BJ. Veterans Administration medical program relationship with medical schools in the United States. Dissertation. The American University; 1969.
3. Kracke RR. The role of the medical college in the medical care of the veteran. J Med Assoc State Ala. 1950;19(8):225-230.
4. US Department of Veterans Affairs, Office of Public Affairs. VA History in Brief. VA Pamphlet 80-97-2. Washington, DC: United States Department of Veterans Affairs; 1997.
5. Servicemen’s Readjustment Act of 1944. 38 USC § 370 (1944).
6. To establish a Department of Medicine and Surgery in the Veterans’ Administration. 38 USC § 73-74 (1946). Accessed August 2, 2022.
7. Lueth HC. Postgraduate wishes of medical officers: final report on 21,029 questionnaires. J Am Med Assoc. 1945; 127(13):759-770.
8. Johnson V, Arestad FH, Tipner A. Medical education in the United States and Canada: forty-sixth annual report on medical education in the United States and Canada by the Council on Medical Education and Hospitals of the American Medical Association. J Am Med Assoc. 1946;131(16):1277-1310.
9. Chesney AM. Some impacts of the specialty board movement on medical education. J Assoc Am Med Coll. 1948;23(2):83-89.
10. Hiscock IV. New frontiers in health education. Can J Public Health. 1946;37(11):452-457.
11. Colwell AR. Principles of graduate medical instruction: with a specific plan of application in a medical school. J Am Med Assoc. 1945;127(13):741-746.
12. Maisel AQ. The veteran betrayed. How long will the Veterans’ Administration continue to give third-rate medical care to first-rate men? Cosmopolitan. 1945(3):45.
13. US Veterans Administration. Policy Memorandum No. 2: Policy in association of veterans’ hospitals with medical schools. January 30, 1946.
14. American Medical Association. Digest of Official Actions: 1846-1958. JAMA. 1946;132:1094.
15. Wentz DK, Ford CV. A brief history of the internship. JAMA. 1984;252(24):3390-3394. doi:10.1001/jama.1984.03350240036035
16. US Department of Veterans Affairs, Veterans Health Administration, Office of Academic Affiliations. Health professions education academic year 2020-2021. Accessed August 8, 2022. https://www.va.gov/OAA/docs/OAA_Stats_AY_2020_2021_FINAL.pdf
The US government has a legacy of providing support for veterans. Pensions were offered to disabled veterans as early as 1776, and benefits were expanded to cover medical needs as the country grew and modernized.1,2 Enacted during the Civil War, the General Pension Act increased benefits for widows and dependents.2 Rehabilitation and vocational training assistance benefits were added after World War I, and the US Department of Veterans Affairs (VA) was created in 1930 to consolidate all benefits under one umbrella organization.2,3
Prior to World War II, the VA lacked the bed capacity for the 4 million veterans who were eligible for care. This shortage became more acute by the end of the war, when the number of eligible veterans increased by 15 million.4 Although the VA successfully built bed capacity through acquisition of military hospitals, VA hospitals struggled to recruit clinical staff.2 Physicians were hesitant to join the VA because civil service salaries were lower than comparable positions in the community, and the VA offered limited opportunities for research or continuing education. These limitations negatively impacted the overall reputation of the VA. The American Medical Association (AMA) was reluctant to directly admit VA physicians for membership because of a “lower” standard of care at VA hospitals.2 This review will describe how passage of 2 legislative actions, the Servicemen’s Readjustment Act and Public Law (PL)79-293, and a key policy memorandum set the foundation for the partnership between the VA and academic medical centers. This led to improved medical care for veterans and expansion of health professions education for VA and the nation.5,6
GI Bill of Rights
The passage of the Servicemen’s Readjustment Act of 1944, better known as the GI Bill of Rights, provided education assistance, guaranteed home loans, and unemployment payments to veterans.5 All medical officers serving during the war were eligible for this benefit, which effectively increased the number of potential physician trainees at the end of World War II by almost 60,000.7 Medical education at the time was simultaneously undergoing a transformation with more rigorous training and a push to standardize medical education across state lines. While prerequisite training was not required for admission to many medical schools and curricula varied in length based on state licensing requirements, more programs were adding premedical education requirements and transitioning to the 4-year curricula seen today. At this time, only 23 states required postgraduate internships for licensure, but this number was growing.8 The American Board of Medical Specialties was established several years prior to World War II in 1934 to elevate the quality of care; the desire for residency training and board certification continued to gain traction during the 1940s.9
Medical Training
In anticipation of an influx of medical trainees, the Committee on Postwar Medical Service conducted a comprehensive survey to understand the training needs of physician veterans returning from World War II.7 The survey collected data from medical officers on their desired length of training, interest in specialty board certification, time served, and type of medical practice prior to enlisting. Length of desired training was categorized as short (up to 6 months), which would serve as a refresher course and provide updates on recent advances in medicine and surgery, and long (> 6 months), which resembled a modern internship or residency. Nineteen percent did not want additional training, 22% wished to pursue short courses, and 51% were interested in longer courses. Most respondents also wished to obtain board certification.7 The AMA played a significant role in supporting the expansion of training opportunities, encouraging all accredited hospitals to assess their capacity to determine the number of additional residents they could accommodate. The AMA also awarded hospitals with existing internship programs temporary accreditation to allow them to add extended training through residency programs.7
Medical schools devised creative solutions to meet the needs of returning physician veterans and capitalize on the available educational benefits. Postgraduate refresher courses that varied in length from hours to months were developed focusing on an array of topics. In addition to basic medical principles, courses covered general topics, such as advances in medicine, to specialty topics, such as nutrition or ophthalmology.7 Although the courses could not be counted toward board certification, participation increased by almost 300% in the 1945/1946 academic year relative to the previous year.7 Increasing access to the longer training courses, including internships and residencies, was often achieved through experiences outside the clinical setting. Yale University modified its curriculum to reduce time devoted to lectures on published materials and encourage active learning and community outreach.10 Northwestern University assigned residents to spend 1 of their 3 years “out of residence” in basic science and clinical instruction provided by the medical school. Tuition assistance from the GI Bill supported the additional expenses incurred by the medical school to fund laboratory space, equipment, and the salaries of the basic science instructors and administrative staff.11
Public Law 79-293
Public Law 79-293 was passed on January 3, 1946, establishing the Department of Medicine and Surgery within the VA. The law, which became the basis for Title 38 chapters 73 and 74, allowed VA hospitals flexibility to hire doctors, dentists, and nurses without regard to the civil service regulations and salary restrictions associated with other federal positions.6
The US government has a legacy of providing support for veterans. Pensions were offered to disabled veterans as early as 1776, and benefits were expanded to cover medical needs as the country grew and modernized.1,2 Enacted during the Civil War, the General Pension Act increased benefits for widows and dependents.2 Rehabilitation and vocational training assistance benefits were added after World War I, and the US Department of Veterans Affairs (VA) was created in 1930 to consolidate all benefits under one umbrella organization.2,3
Prior to World War II, the VA lacked the bed capacity for the 4 million veterans who were eligible for care. This shortage became more acute by the end of the war, when the number of eligible veterans increased by 15 million.4 Although the VA successfully built bed capacity through acquisition of military hospitals, VA hospitals struggled to recruit clinical staff.2 Physicians were hesitant to join the VA because civil service salaries were lower than comparable positions in the community, and the VA offered limited opportunities for research or continuing education. These limitations negatively impacted the overall reputation of the VA. The American Medical Association (AMA) was reluctant to directly admit VA physicians for membership because of a “lower” standard of care at VA hospitals.2 This review will describe how passage of 2 legislative actions, the Servicemen’s Readjustment Act and Public Law (PL) 79-293, and a key policy memorandum set the foundation for the partnership between the VA and academic medical centers. This led to improved medical care for veterans and expansion of health professions education for VA and the nation.5,6
GI Bill of Rights
The passage of the Servicemen’s Readjustment Act of 1944, better known as the GI Bill of Rights, provided education assistance, home loan guarantees, and unemployment payments to veterans.5 All medical officers serving during the war were eligible for this benefit, which effectively increased the number of potential physician trainees at the end of World War II by almost 60,000.7 Medical education at the time was simultaneously undergoing a transformation, with more rigorous training and a push to standardize medical education across state lines. While prerequisite training was not required for admission to many medical schools and curricula varied in length based on state licensing requirements, more programs were adding premedical education requirements and transitioning to the 4-year curricula seen today. At this time, only 23 states required postgraduate internships for licensure, but this number was growing.8 The American Board of Medical Specialties was established several years prior to World War II, in 1934, to elevate the quality of care; the desire for residency training and board certification continued to gain traction during the 1940s.9
Medical Training
In anticipation of an influx of medical trainees, the Committee on Postwar Medical Service conducted a comprehensive survey to understand the training needs of physician veterans returning from World War II.7 The survey collected data from medical officers on their desired length of training, interest in specialty board certification, time served, and type of medical practice prior to enlisting. Length of desired training was categorized as short (up to 6 months), which would serve as a refresher course and provide updates on recent advances in medicine and surgery, and long (> 6 months), which resembled a modern internship or residency. Nineteen percent did not want additional training, 22% wished to pursue short courses, and 51% were interested in longer courses. Most respondents also wished to obtain board certification.7 The AMA played a significant role in supporting the expansion of training opportunities, encouraging all accredited hospitals to assess their capacity to determine the number of additional residents they could accommodate. The AMA also awarded hospitals with existing internship programs temporary accreditation to allow them to add extended training through residency programs.7
Medical schools devised creative solutions to meet the needs of returning physician veterans and capitalize on the available educational benefits. Postgraduate refresher courses that varied in length from hours to months were developed focusing on an array of topics. In addition to basic medical principles, courses covered general topics, such as advances in medicine, to specialty topics, such as nutrition or ophthalmology.7 Although the courses could not be counted toward board certification, participation increased by almost 300% in the 1945/1946 academic year relative to the previous year.7 Increasing access to the longer training courses, including internships and residencies, was often achieved through experiences outside the clinical setting. Yale University modified its curriculum to reduce time devoted to lectures on published materials and encourage active learning and community outreach.10 Northwestern University assigned residents to spend 1 of their 3 years “out of residence” in basic science and clinical instruction provided by the medical school. Tuition assistance from the GI Bill supported the additional expenses incurred by the medical school to fund laboratory space, equipment, and the salaries of the basic science instructors and administrative staff.11
Public Law 79-293
Public Law 79-293 was passed on January 3, 1946, establishing the Department of Medicine and Surgery within the VA. The law, which became the basis for Title 38 chapters 73 and 74, allowed VA hospitals flexibility to hire doctors, dentists, and nurses without regard to the civil service regulations and salary restrictions associated with other federal positions.6
Concerns about quality of care had been mounting for years, and the release of several sensationalized and critical articles motivated VA leadership to make sweeping changes. One article described neglect at VA hospitals.12 Excessive paperwork and low economic benefits were identified as barriers to the recruitment of qualified clinicians at the VA.2 The VA Special Medical Advisory Group investigating the claims recommended that the VA encourage its hospitals to affiliate with medical schools to improve the quality of care. This group also recommended that new VA hospitals be constructed near academic medical centers to allow access to consultants.2 Three large veterans service organizations (American Legion, Veterans of Foreign Wars, and Disabled American Veterans) conducted their own investigations in response to the media reports. The organizations reported that the quality of care in most VA hospitals was already on par with the community but indicated that the VA would benefit from expansion of medical research and training, increased bed capacity, reduction in the administrative burden on clinicians, and increased salaries for clinical staff.2
Policy Memorandum No. 2
The relationship between VA and academic medical centers was solidified on January 30, 1946, with adoption of Policy Memorandum No. 2.13 This memorandum allowed for the establishment of relationships with academic medical centers to provide “the veteran a much higher standard of medical care than could be given him with a wholly full-time medical staff.” Shortly after this memorandum was signed, residents from Northwestern University and the University of Illinois at Chicago began clinical rotations at the Hines VA facility in Chicago, Illinois.2 By 1947, 62 medical schools had committed to an affiliation with local VA hospitals and 21 deans’ committees were in operation, which were responsible for the appointment of physician residents and consultants. The AMA extended direct membership privileges to VA physicians, and by 1947 the number of residency positions doubled nationally.14,15 The almost universal support of the relationship between VA and academic affiliates provided educational opportunities for returning veterans and raised standards for medical education nationally.
Current State
Since the passage of PL 79-293 and PM No. 2, the VA-academic health professions education partnership has grown to include 113,000 trainees rotating through 150 VA medical centers annually from more than 1400 colleges and universities.16 Most VA podiatrists, psychologists, optometrists, and physicians working in VA medical centers also trained at VA, and trainees are 37% more likely to consider a job at VA after completing their clinical rotations. This unique partnership began 76 years ago and continues to provide clinicians “for VA and the nation.”
1. Glasson WH. History of military pension legislation in the United States. Columbia University Press; 1900.
2. Lewis BJ. Veterans Administration medical program relationship with medical schools in the United States. Dissertation. The American University; 1969.
3. Kracke RR. The role of the medical college in the medical care of the veteran. J Med Assoc State Ala. 1950;19(8):225-230.
4. US Department of Veterans Affairs, Office of Public Affairs. VA History in Brief. VA Pamphlet 80-97-2. Washington, DC: United States Department of Veterans Affairs; 1997.
5. Servicemen’s Readjustment Act of 1944. 38 USC § 370 (1944).
6. To establish a Department of Medicine and Surgery in the Veterans’ Administration. 38 USC § 73-74 (1946). Accessed August 2, 2022.
7. Lueth HC. Postgraduate wishes of medical officers: final report on 21,029 questionnaires. J Am Med Assoc. 1945; 127(13):759-770.
8. Johnson V, Arestad FH, Tipner A. Medical education in the United States and Canada: forty-sixth annual report on medical education in the United States and Canada by the Council on Medical Education and Hospitals of the American Medical Association. J Am Med Assoc. 1946;131(16):1277-1310.
9. Chesney AM. Some impacts of the specialty board movement on medical education. J Assoc Am Med Coll. 1948;23(2):83-89.
10. Hiscock IV. New frontiers in health education. Can J Public Health. 1946;37(11):452-457.
11. Colwell AR. Principles of graduate medical instruction: with a specific plan of application in a medical school. J Am Med Assoc. 1945;127(13):741-746.
12. Maisel, AQ. The veteran betrayed. How long will the Veterans’ Administration continue to give third-rate medical care to first-rate men? Cosmopolitan. 1945(3):45.
13. US Veterans Administration. Policy Memorandum No. 2: Policy in association of veterans’ hospitals with medical schools. January 30, 1946.
14. American Medical Association. Digest of Official Actions: 1846-1958. JAMA. 1946;132:1094.
15. Wentz DK, Ford CV. A brief history of the internship. JAMA. 1984;252(24):3390-3394. doi:10.1001/jama.1984.03350240036035
16. US Department of Veterans Affairs, Veterans Health Administration, Office of Academic Affiliations. Health professions education academic year 2020-2021. Accessed August 8, 2022. https://www.va.gov/OAA/docs/OAA_Stats_AY_2020_2021_FINAL.pdf
How does salt intake relate to mortality?
Intake of salt is a biological necessity, inextricably woven into physiologic systems. However, excessive salt intake is associated with high blood pressure. Hypertension is linked to increased cardiovascular morbidity and mortality, and it is estimated that excessive salt intake causes approximately 5 million deaths per year worldwide. Reducing salt intake lowers blood pressure, but processed foods contain “hidden” salt, which makes dietary control of salt difficult. This problem is compounded by growing inequalities in food systems, which present another hurdle to sustaining individual dietary control of salt intake.
Of the 87 risk factors included in the Global Burden of Diseases, Injuries, and Risk Factors Study 2019, high systolic blood pressure was identified as the leading risk factor for disease burden at the global level and for its effect on human health. A range of strategies, including primary care management and reduction in sodium intake, are known to reduce the burden of this critical risk factor. Yet questions remain unanswered.
Cardiovascular disease and death
Because dietary sodium intake has been identified as a risk factor for cardiovascular disease and premature death, high sodium intake can be expected to curtail life span. A study tested this hypothesis by analyzing the relationship between sodium intake and life expectancy and survival in 181 countries. Sodium intake correlated positively with life expectancy and inversely with all-cause mortality worldwide and in high-income countries, which argues against dietary sodium intake curtailing life span or being a risk factor for premature death. These results help fuel a scientific debate about sodium intake, life expectancy, and mortality. The debate requires interpreting composite data of positive linear, J-shaped, or inverse linear correlations, which underscores the uncertainty regarding this issue.
In a prospective study of 501,379 participants from the UK Biobank, researchers found that higher frequency of adding salt to foods was significantly associated with a higher risk of premature mortality and lower life expectancy independently of diet, lifestyle, socioeconomic level, and preexisting diseases. They found that the positive association appeared to be attenuated with increasing intake of high-potassium foods (vegetables and fruits).
In addition, the researchers made the following observations:
- For cause-specific premature mortality, they found that higher frequency of adding salt to foods was significantly associated with a higher risk of cardiovascular disease mortality and cancer mortality (P-trend < .001 and P-trend < .001, respectively).
- Always adding salt to foods was associated with lower life expectancy at the age of 50 years by 1.50 years (95% confidence interval [CI], 0.72-2.30) for women and 2.28 years (95% CI, 1.66-2.90) for men, compared with participants who never or rarely added salt to foods.
The researchers noted that adding salt to foods (usually at the table) is common and is directly related to an individual’s long-term preference for salty foods and habitual salt intake. Indeed, in the Western diet, adding salt at the table accounts for 6%-20% of total salt intake. In addition, commonly used table salt contains 97%-99% sodium chloride, minimizing the potential confounding effects of other dietary factors, including potassium. Therefore, adding salt to foods provides a way to evaluate the association between habitual sodium intake and mortality – something that is relevant, given that it has been estimated that in 2010, a total of 1.65 million deaths from cardiovascular causes were attributable to consumption of more than 2.0 g of sodium per day.
Salt sensitivity
Current evidence supports a recommendation for moderate sodium intake in the general population (3-5 g/day). Persons with hypertension should consume salt at the lower end of that range. Some dietary guidelines recommend consuming less than 2,300 mg dietary sodium per day for persons aged 14 years or older and less for persons aged 2-13 years. Although low sodium intake (< 2.0 g/day) has been achieved in short-term clinical trials, sustained low sodium intake has not been achieved in any of the longer-term clinical trials (duration > 6 months).
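The guideline ceiling and the moderate range above are stated in different units (mg/day vs g/day). A quick conversion, sketched below purely for illustration, shows that the guideline figure actually sits below the moderate range discussed for the general population:

```python
# Unit check: guideline threshold is in mg/day, the moderate range in g/day.
guideline_mg = 2300          # recommended ceiling for persons aged >= 14 years
moderate_range_g = (3, 5)    # moderate intake discussed for the general population

guideline_g = guideline_mg / 1000
print(guideline_g)                        # 2.3 g/day
print(guideline_g < moderate_range_g[0])  # True: the guideline is below the range
```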
The controversy continues as to the relationship between low sodium intake and blood pressure or cardiovascular diseases. Most studies show that blood pressure is reduced by consuming less sodium, both in individuals with hypertension and in those without. However, it is not necessarily lowered further by reducing sodium intake below the moderate range (3-5 g/day). With a sodium-rich diet, most normotensive individuals experienced a minimal change in mean arterial pressure; for many individuals with hypertension, the values increased by about 4 mm Hg. In addition, among individuals with hypertension who are “salt sensitive,” arterial pressure can increase by > 10 mm Hg in response to high sodium intake.
The effect of potassium
Replacing some of the sodium chloride in regular salt with potassium chloride may mitigate some of salt’s harmful cardiovascular effects. Indeed, salt substitutes that have reduced sodium levels and increased potassium levels have been shown to lower blood pressure.
In one trial, researchers enrolled over 20,000 persons from 600 villages in rural China and compared the use of regular salt (100% sodium chloride) with the use of a salt substitute (75% sodium chloride and 25% potassium chloride by mass).
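The sodium and potassium delivered per gram of each blend can be estimated from standard elemental mass fractions. The sketch below uses textbook molar masses (Na 22.99, Cl 35.45, K 39.10 g/mol), not figures from the trial itself, so treat it as an illustrative approximation:

```python
# Elemental mass fractions computed from molar masses (g/mol).
NA_FRACTION = 22.99 / (22.99 + 35.45)  # ~39% of NaCl mass is sodium
K_FRACTION = 39.10 / (39.10 + 35.45)   # ~52% of KCl mass is potassium

def composition_per_gram(nacl_share: float, kcl_share: float) -> tuple[int, int]:
    """Return approximate (mg sodium, mg potassium) in 1 g of a salt blend."""
    sodium_mg = round(1000 * nacl_share * NA_FRACTION)
    potassium_mg = round(1000 * kcl_share * K_FRACTION)
    return sodium_mg, potassium_mg

regular = composition_per_gram(1.00, 0.00)     # 100% NaCl
substitute = composition_per_gram(0.75, 0.25)  # 75% NaCl, 25% KCl by mass

print(regular)     # (393, 0)
print(substitute)  # (295, 131)
```

In other words, the substitute cuts sodium by about a quarter per gram while adding a meaningful amount of potassium.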
The participants were at high risk for stroke, cardiovascular events, and death. The mean duration of follow-up was 4.74 years. The results were surprising. The rate of stroke was lower with the salt substitute than with regular salt (29.14 events vs. 33.65 events per 1,000 person-years; rate ratio, 0.86; 95% CI, 0.77-0.96; P = .006), as were the rates of major cardiovascular events and death from any cause. The rate of serious adverse events attributed to hyperkalemia was not significantly higher with the salt substitute than with regular salt.
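The reported stroke rates allow a rough sanity check of the rate ratio. The division below is a crude, unadjusted calculation; the published 0.86 comes from the trial's adjusted model, so a small discrepancy is expected:

```python
# Crude stroke rate ratio from the reported event rates (per 1,000 person-years).
substitute_rate = 29.14  # salt substitute group
regular_rate = 33.65     # regular salt group

crude_ratio = substitute_rate / regular_rate
print(round(crude_ratio, 2))  # 0.87, close to the reported adjusted ratio of 0.86
```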
Although there is an ongoing debate about the extent of salt’s effects on the cardiovascular system, there is no doubt that in most places in the world, people are consuming more salt than the body needs.
A lot depends upon the kind of diet consumed by a particular population. Processed food is rarely used in rural areas, such as those involved in the above-mentioned trial, with dietary sodium chloride being added while preparing food at home. This is a determining factor with regard to cardiovascular outcomes, but it cannot be generalized to other social-environmental settings.
In much of the world, commercial food preservation introduces a lot of sodium chloride into the diet, and this intake cannot be fully offset by the use of salt substitutes. Indeed, by comparing the sodium content of cereal-based products currently sold on the Italian market with the respective benchmarks proposed by the World Health Organization, researchers found that for most items, the sodium content is much higher than the benchmarks, especially for flatbreads, leavened breads, and crackers/savory biscuits. This shows that there is work to be done to achieve the World Health Organization/United Nations objective of a 30% global reduction in sodium intake by 2025.
This article was translated from Univadis Italy. A version of this article first appeared on Medscape.com.
Intake of salt is a biological necessity, inextricably woven into physiologic systems. However, excessive salt intake is associated with high blood pressure. Hypertension is linked to increased cardiovascular morbidity and mortality, and it is estimated that excessive salt intake causes approximately 5 million deaths per year worldwide. Reducing salt intake lowers blood pressure, but processed foods contain “hidden” salt, which makes dietary control of salt difficult. This problem is compounded by growing inequalities in food systems, which present another hurdle to sustaining individual dietary control of salt intake.
Of the 87 risk factors included in the Global Burden of Diseases, Injuries, and Risk Factors Study 2019, high systolic blood pressure was identified as the leading risk factor for disease burden at the global level and for its effect on human health. A range of strategies, including primary care management and reduction in sodium intake, are known to reduce the burden of this critical risk factor. Two questions remain unanswered:
Cardiovascular disease and death
Because dietary sodium intake has been identified as a risk factor for cardiovascular disease and premature death, high sodium intake can be expected to curtail life span. A study tested this hypothesis by analyzing the relationship between sodium intake and life expectancy and survival in 181 countries. Sodium intake correlated positively with life expectancy and inversely with all-cause mortality worldwide and in high-income countries, which argues against dietary sodium intake curtailing life span or a being risk factor for premature death. These results help fuel a scientific debate about sodium intake, life expectancy, and mortality. The debate requires interpreting composite data of positive linear, J-shaped, or inverse linear correlations, which underscores the uncertainty regarding this issue.
In a prospective study of 501,379 participants from the UK Biobank, researchers found that higher frequency of adding salt to foods was significantly associated with a higher risk of premature mortality and lower life expectancy independently of diet, lifestyle, socioeconomic level, and preexisting diseases. They found that the positive association appeared to be attenuated with increasing intake of high-potassium foods (vegetables and fruits).
In addition, the researchers made the following observations:
- For cause-specific premature mortality, they found that higher frequency of adding salt to foods was significantly associated with a higher risk of cardiovascular disease mortality and cancer mortality (P-trend < .001 and P-trend < .001, respectively).
- Always adding salt to foods was associated with the lower life expectancy at the age of 50 years by 1.50 (95% confidence interval, 0.72-2.30) and 2.28 (95% CI, 1.66-2.90) years for women and men, respectively, compared with participants who never or rarely added salt to foods.
The researchers noted that adding salt to foods (usually at the table) is common and is directly related to an individual’s long-term preference for salty foods and habitual salt intake. Indeed, in the Western diet, adding salt at the table accounts for 6%-20% of total salt intake. In addition, commonly used table salt contains 97%-99% sodium chloride, minimizing the potential confounding effects of other dietary factors, including potassium. Therefore, adding salt to foods provides a way to evaluate the association between habitual sodium intake and mortality – something that is relevant, given that it has been estimated that in 2010, a total of 1.65 million deaths from cardiovascular causes were attributable to consumption of more than 2.0 g of sodium per day.
Salt sensitivity
Current evidence supports a recommendation for moderate sodium intake in the general population (3-5 g/day). Persons with hypertension should consume salt at the lower end of that range. Some dietary guidelines recommend consuming less than 2,300 mg dietary sodium per day for persons aged 14 years or older and less for persons aged 2-13 years. Although low sodium intake (< 2.0 g/day) has been achieved in short-term clinical trials, sustained low sodium intake has not been achieved in any of the longer-term clinical trials (duration > 6 months).
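The guideline figures above mix units – grams of sodium per day versus milligrams of sodium – and sodium is easily confused with salt (sodium chloride) itself. The conversion is simple chemistry, not data from any of the cited studies; a minimal sketch:

```python
# Quick unit converter: dietary sodium (mg) -> table-salt (NaCl) equivalent (g).
# Uses standard molar masses; this is basic chemistry, not from the study.

NA_MOLAR = 22.990   # g/mol, sodium
CL_MOLAR = 35.453   # g/mol, chlorine

def sodium_to_salt_g(sodium_mg: float) -> float:
    """Convert milligrams of sodium to grams of sodium chloride."""
    return sodium_mg * (NA_MOLAR + CL_MOLAR) / NA_MOLAR / 1000

# The 2,300 mg/day guideline ceiling corresponds to roughly 5.8 g of salt,
# and the 5 g/day upper end of the "moderate" sodium range to roughly 12.7 g.
print(round(sodium_to_salt_g(2300), 1))  # ~5.8
print(round(sodium_to_salt_g(5000), 1))  # ~12.7
```

In other words, a gram of sodium corresponds to about 2.5 g of salt, which is why sodium-based and salt-based recommendations can look so different at first glance.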
The controversy continues as to the relationship between low sodium intake and blood pressure or cardiovascular disease. Most studies show that, in individuals both with and without hypertension, blood pressure is reduced by consuming less sodium; however, it is not necessarily lowered further by cutting intake below the moderate range (3-5 g/day). With a sodium-rich diet, most normotensive individuals experience a minimal change in mean arterial pressure, whereas for many individuals with hypertension, values increase by about 4 mm Hg. In addition, among individuals with hypertension who are “salt sensitive,” arterial pressure can increase by more than 10 mm Hg in response to high sodium intake.
The effect of potassium
Replacing some of the sodium chloride in regular salt with potassium chloride may mitigate some of salt’s harmful cardiovascular effects. Indeed, salt substitutes that have reduced sodium levels and increased potassium levels have been shown to lower blood pressure.
In one trial, researchers enrolled over 20,000 persons from 600 villages in rural China and compared the use of regular salt (100% sodium chloride) with the use of a salt substitute (75% sodium chloride and 25% potassium chloride by mass).
The participants were at high risk for stroke, cardiovascular events, and death. The mean duration of follow-up was 4.74 years. The results were surprising. The rate of stroke was lower with the salt substitute than with regular salt (29.14 events vs. 33.65 events per 1,000 person-years; rate ratio, 0.86; 95% CI, 0.77-0.96; P = .006), as were the rates of major cardiovascular events and death from any cause. The rate of serious adverse events attributed to hyperkalemia was not significantly higher with the salt substitute than with regular salt.
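The reported rate ratio can be sanity-checked directly from the two event rates. This is only a back-of-the-envelope division of the published rates, not a re-analysis; the trial's own estimate and 95% CI come from its statistical model, which also accounts for the cluster design:

```python
# Plausibility check: the crude ratio of the reported stroke event rates
# should approximate the published rate ratio of 0.86.

substitute_rate = 29.14  # stroke events per 1,000 person-years, salt substitute
regular_rate = 33.65     # stroke events per 1,000 person-years, regular salt

crude_ratio = substitute_rate / regular_rate
print(round(crude_ratio, 2))  # ~0.87, close to the reported rate ratio of 0.86
```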
Although there is an ongoing debate about the extent of salt’s effects on the cardiovascular system, there is no doubt that in most places in the world, people are consuming more salt than the body needs.
A lot depends upon the kind of diet consumed by a particular population. Processed food is rarely used in rural areas, such as those involved in the above-mentioned trial, with dietary sodium chloride being added while preparing food at home. This is a determining factor with regard to cardiovascular outcomes, but it cannot be generalized to other social-environmental settings.
In much of the world, commercial food preservation introduces a lot of sodium chloride into the diet, and that portion of salt intake cannot be offset by salt substitutes used at home. Indeed, by comparing the sodium content of cereal-based products currently sold on the Italian market with the respective benchmarks proposed by the World Health Organization, researchers found that for most items, the sodium content is much higher than the benchmarks, especially for flatbreads, leavened breads, and crackers/savory biscuits. This shows that there is work to be done to achieve the World Health Organization/United Nations objective of a 30% global reduction in sodium intake by 2025.
This article was translated from Univadis Italy. A version of this article first appeared on Medscape.com.
The potential problem(s) with a once-a-year COVID vaccine
Comments from the White House this week suggesting a once-a-year COVID-19 shot for most Americans, “just like your annual flu shot,” were met with backlash from many who say COVID and influenza come from different viruses and need different schedules.
Reactions, ranging from charges of “capitulation” to complaints of too few data, hit the airwaves and social media.
Some, however, agree with the White House vision and say that asking people to get one shot in the fall instead of periodic pushes for boosters will raise public confidence and buy-in and reduce consumer confusion.
Health leaders, including Bob Wachter, MD, chair of the department of medicine at the University of California, San Francisco, say they like the framing of the concept – that people who are not high-risk should plan each year for a COVID shot and a flu shot.
“… & we need strategy to bump uptake,” Dr. Wachter tweeted this week.
But the numbers of Americans seeking boosters remain low. Only one-third of all eligible people 50 years and older have gotten a second COVID booster, according to the Centers for Disease Control and Prevention. About half of those who got the original two shots got a first booster.
Meanwhile, the United States is still averaging about 70,000 new COVID cases and more than 300 deaths every day.
The suggested change in approach comes as Pfizer/BioNTech and Moderna roll out their new boosters that target Omicron subvariants BA.4 and BA.5, after the U.S. Food and Drug Administration granted emergency use authorization and the CDC recommended their use.
“As the virus continues to change, we will now be able to update our vaccines annually to target the dominant variant,” President Joe Biden said in a statement promoting the yearly approach.
Some say annual shot premature
Other experts say it’s too soon to tell whether an annual approach will work.
“We have no data to support that current vaccines, including the new BA.5 booster, will provide durable protection beyond 4-6 months. It would be good to aspire to this objective, and much longer duration of protection, but that will likely require next generation and nasal vaccines,” said Eric Topol, MD, Medscape’s editor-in-chief and founder and director of the Scripps Research Translational Institute.
A report in Nature Reviews Immunology states, “Mucosal vaccines offer the potential to trigger robust protective immune responses at the predominant sites of pathogen infection” and potentially “can prevent an infection from becoming established in the first place, rather than only curtailing infection and protecting against the development of disease symptoms.”
Dr. Topol tweeted after the White House statements, “[An annual vaccine] has the ring of Covid capitulation.”
William Schaffner, MD, an infectious disease expert at Vanderbilt University, Nashville, Tenn., told this news organization that he cautions against interpreting the White House comments as official policy.
“This is the difficulty of having public health announcements come out of Washington,” he said. “They ought to come out of the CDC.”
He says there is a reasonable analogy between COVID and influenza, but warns, “don’t push the analogy.”
They are both serious respiratory viruses that can cause much illness and death in essentially the same populations, he notes: older, frail people and those who have underlying illnesses or are immunocompromised.
Both viruses also mutate. But there the paths diverge.
“We’ve gotten into a pattern of annually updating the influenza vaccine because it is such a singularly seasonal virus,” Dr. Schaffner said. “Basically it disappears during the summer. We’ve had plenty of COVID during the summers.”
For COVID, he said, “We will need a periodic booster. Could this be annually? That would certainly make it easier.” But it’s too soon to tell, he said.
Dr. Schaffner noted that several manufacturers are working on a combined flu/COVID vaccine.
Just a ‘first step’ toward annual shot
The currently updated COVID vaccine may be the first step toward an annual vaccine, but it’s only the first step, Dr. Schaffner said. “We haven’t committed to further steps yet because we’re watching this virus.”
Syra Madad, DHSc, MSc, an infectious disease epidemiologist at Harvard University’s Belfer Center for Science and International Affairs, Cambridge, Mass., and the New York City hospital system, told this news organization that arguments on both sides make sense.
Having a single message once a year can help eliminate the considerable confusion involving people on individual timelines with different levels of immunity and separate campaigns for COVID and flu shots coming at different times of the year.
“Communication around vaccines is very muddled and that shows in our overall vaccination rates, particularly booster rates,” she says. “The overall strategy is hopeful and makes sense if we’re going to progress that way based on data.”
However, she said that the data are just not there yet to show it’s time for an annual vaccine. First, scientists will need to see how long protection lasts with the Omicron-specific vaccine and how well and how long it protects against severe disease and death as well as infection.
COVID is less predictable than influenza and the influenza vaccine has been around for decades, Dr. Madad noted. With influenza, the patterns are more easily anticipated with their “ladder-like pattern,” she said. “COVID-19 is not like that.”
What is hopeful, she said, “is that we’ve been in the Omicron dynasty since November of 2021. I’m hopeful that we’ll stick with that particular variant.”
Dr. Topol, Dr. Schaffner, and Dr. Madad declared no relevant financial relationships.
A version of this article first appeared on Medscape.com.
Flashy, blingy doc sabotages his own malpractice trial in rural farm town
During a medical malpractice trial in New Jersey, jurors waited nearly 4 hours for the physician defendant to show up. When he did arrive, the body-building surgeon was sporting two thick gold chains and a diamond pinky ring, and had the top buttons of his shirt open enough to reveal his chest hair.
“This trial was in a very rural, farming community,” recalls medical liability defense attorney Catherine Flynn, of Flynn Watts LLC, based in Parsippany, N.J. “Many of the jurors were wearing flannel shirts and jeans. The doctor’s wife walked in wearing a five-carat diamond ring and other jewelry.”
Ms. Flynn took the couple aside and asked them to remove the jewelry. She explained that the opulent accessories could damage the jury’s view of the physician. The surgeon and his wife, however, refused to remove their jewelry, she said. They didn’t think it was a big deal.
The case against the surgeon involved intraoperative damage to a patient when the physician inadvertently removed a portion of nerve in the area of the procedure. After repair of the nerve, the patient had a good outcome. However, the patient alleged that the surgeon’s negligence resulted in permanent damage despite the successful repair.
Jurors ultimately found the physician negligent in the case and awarded the plaintiff $1.2 million. Ms. Flynn believes the physician’s flamboyant attire and arrogant nature tainted the jury’s decision.
“In certain counties in New Jersey, his attire would not have been a problem,” she said. “In this rural, farming county, it was a huge problem. You have to know your audience. There are a lot of other things that come into play in a medical malpractice case, but when it comes to damages in a case, you don’t want to be sending the message that supports what somebody’s bias may already be telling them about a doctor.”
The surgeon appealed the verdict, and the case ultimately settled for a lesser amount, according to Ms. Flynn.
An over-the-top wardrobe is just one way that physicians can negatively influence jurors during legal trials. From subtle facial expressions to sudden outbursts to downright rudeness, attorneys have witnessed countless examples of physicians sabotaging their own trials.
“The minute you enter the courthouse, jurors or potential jurors are sizing you up,” says health law attorney Michael Clark, of Womble Bond Dickinson (US) LLP, based in Houston. “The same phenomenon occurs in a deposition. Awareness of how you are being assessed at all times, and the image that is needed, is important since a negative impression by jurors can have a detrimental effect on a physician’s case.”
Juror: We didn’t like the doctor’s shoes
In another case, attorneys warned a physician defendant against dressing in his signature wardrobe during his trial. Against their advice, the doctor showed up daily to his trial in bright pastel, monochromatic suits with matching Gucci-brand shoes, said medical liability defense attorney Meredith C. Lander, of Kaufman Borgeest & Ryan LLP, based in Connecticut. On the witness stand, the doctor was long-winded and wasn’t “terribly likable,” Ms. Lander said.
However, the evidence weighed in the physician’s favor, and there was strong testimony by defense experts. The physician won the case, Ms. Lander said, but after the verdict, the jury foreperson approached the trial attorney and made some disparaging remarks about the defendant.
“The foreperson said the jury didn’t like the doctor or his ‘Gucci suits and shoes,’ but they believed the experts,” Ms. Lander said.
Disruptive behavior can also harm jurors’ perception of physicians, Ms. Flynn adds. During one instance, a surgeon insisted on sitting next to Ms. Flynn, although she generally requests clients sit in the first row so that jurors are not so focused on their reactions during testimony. The surgeon loudly peppered Ms. Flynn with questions as witnesses testified, prompting a reprimand from the judge.
“The judge admonished the doctor several times and said, ‘Doctor, you’re raising your voice. You’ll get a chance to speak with your attorney during the break,’ ” Ms. Flynn recalled. “The doctor refused to stop talking, and the judge told him in front of the jury to go sit in the back of the courtroom. His reaction was, ‘Why do I have to move?! I need to sit here!’ ”
The surgeon eventually moved to the back of the courtroom and a sheriff’s deputy stood next to him. Testimony continued until a note in the form of a paper airplane landed on the table in front of Ms. Flynn. She carefully crumpled the note and tossed it in the wastebasket. Luckily, this drew a laugh from jurors, she said.
But things got worse when the surgeon testified. Rather than answer the questions, he interrupted and started telling jurors his own version of events.
“The judge finally said, ‘Doctor, if you don’t listen to your attorney and answer her questions, I’m going to make you get off the stand,’ ” Ms. Flynn said. “That was the most unbelievable, egregious self-sabotage trial moment I’ve ever experienced.”
Fortunately, the physician’s legal case was strong, and the experts who testified drove the defense’s side home, Ms. Flynn said. The surgeon won the case.
Attorney: Watch what you say in the elevator
Other, more subtle behaviors – while often unintentional – can also be damaging.
Physicians often let their guard down while outside the courtroom and can unknowingly wind up next to a juror in an elevator or standing in a hallway, said Laura Postilion, a partner at Quintairos, Prieto, Wood & Boyer, P.A., based in Chicago.
“For instance, a doctor is in an elevator and feels that some witness on the stand was lying,” Ms. Postilion said. “They might be very upset about it and start ranting about a witness lying, not realizing a juror is in the elevator with them.”
Physicians should also be cautious when speaking on the phone to their family or friends during a trial break.
“At the Daley Center in downtown Chicago, there are these long corridors and long line of windows; a lot of people will stand there during breaks. A doctor may be talking to his or her spouse and saying, ‘Yeah, this juror is sleeping!’ Jurors are [often] looking for drama. They’re looking for somebody letting their guard down. Hearing a doctor speak badly about them would certainly give them a reason to dislike the physician.”
Ms. Postilion warns against talking about jurors in or outside of the courtroom. This includes parking structures, she said.
Physicians can take additional steps to save themselves from negative judgment from jurors, attorneys say. Even before the trial starts, Ms. Postilion advises clients to make their social media accounts private. Some curious jurors may look up a physician’s social media accounts to learn more about their personal life, political leanings, or social beliefs, which could prejudice them against the doctor, she said.
Once on the stand, the words and tone used are key. The last thing a physician defendant wants is to come across as arrogant or condescending to jurors, said medical liability defense attorney Michael Moroney, of Flynn Watts LLC.
“For instance, a defendant might say, ‘Well, let me make this simple for you,’ as if they’re talking to a bunch of schoolchildren,” he said. “You don’t know who’s on the jury. That type of language can be offensive.”
Ms. Lander counsels her clients to refrain from using the common phrase, “honestly,” before answering questions on the stand.
“Everything you’re saying on the stand is presumed to be honest,” she said. “When you start an answer with, ‘Honestly…’ out of habit, it really does undercut everything that follows and everything else that’s already been said. It suggests that you were not being honest in your other answers.”
Attitude, body language speak volumes
Keep in mind that plaintiffs’ attorneys will try their best to rattle physicians on the stand and get them to appear unlikeable, says Mr. Clark, the Houston-based health law attorney. Physicians who lose their cool and begin arguing with attorneys play into their strategy.
“Plaintiffs’ attorneys have been trained in ways to get under their skin,” he said. “Righteous indignation and annoyance are best left for a rare occasion. Think about how you feel in a social setting when people are bickering in front of you. It’s uncomfortable at best. That’s how a jury feels too.”
Body language is also important, Mr. Clark notes. Physicians should avoid crossed arms, leaning back and rocking, or putting a hand on their mouth while testifying, he said. Many attorneys have practice sessions with their clients and record the interaction so that doctors can watch it and see how they look.
“Know your strengths and weaknesses,” he said. “Get help from your lawyer and perhaps consultants about how to improve these skills. Practice and preparation are important.”
Ms. Postilion goes over courtroom clothing with physician clients before trial. Anything “too flashy, too high-end, or too dumpy” should be avoided, she said. Getting accustomed to the courtroom and practicing in an empty courtroom are good ways to ensure that a physician’s voice is loud enough and projecting far enough in the courtroom, she adds.
“The doctor should try to be the best version of him- or herself to jurors,” she said. “A jury can pick up someone who’s trying to be something they’re not. A good attorney can help the doctor find the best version of themselves and capitalize on it. What is it that you want the jury to know about your care of the patient? Take that overall feeling and make sure it’s clearly expressed to the jury.”
A version of this article first appeared on Medscape.com.
Disruptive behavior can also harm jurors’ perception of physicians, Ms. Flynn adds. During one instance, a surgeon insisted on sitting next to Ms. Flynn, although she generally requests clients sit in the first row so that jurors are not so focused on their reactions during testimony. The surgeon loudly peppered Ms. Flynn with questions as witnesses testified, prompting a reprimand from the judge.
“The judge admonished the doctor several times and said, ‘Doctor, you’re raising your voice. You’ll get a chance to speak with your attorney during the break,’ ” Ms. Flynn recalled. “The doctor refused to stop talking, and the judge told him in front of the jury to go sit in the back of the courtroom. His reaction was, ‘Why do I have to move?! I need to sit here!’ ”
The surgeon eventually moved to the back of the courtroom and a sheriff’s deputy stood next to him. Testimony continued until a note in the form of a paper airplane landed on the table in front of Ms. Flynn. She carefully crumpled the note and tossed it in the wastebasket. Luckily, this drew a laugh from jurors, she said.
But things got worse when the surgeon testified. Rather than answer the questions, he interrupted and started telling jurors his own version of events.
“The judge finally said, ‘Doctor, if you don’t listen to your attorney and answer her questions, I’m going to make you get off the stand,’ ” Ms. Flynn said. “That was the most unbelievable, egregious self-sabotage trial moment I’ve ever experienced.”
Fortunately, the physician’s legal case was strong, and the experts who testified drove the defense’s side home, Ms. Flynn said. The surgeon won the case.
Attorney: Watch what you say in the elevator
Other, more subtle behaviors – while often unintentional – can also be damaging.
Physicians often let their guard down while outside the courtroom and can unknowingly wind up next to a juror in an elevator or standing in a hallway, said Laura Postilion, a partner at Quintairos, Prieto, Wood & Boyer, P.A., based in Chicago.
“For instance, a doctor is in an elevator and feels that some witness on the stand was lying,” Ms. Postilion said. “They might be very upset about it and start ranting about a witness lying, not realizing there is a juror in the elevator with them.”
Physicians should also be cautious when speaking on the phone to their family or friends during a trial break.
“At the Daley Center in downtown Chicago, there are these long corridors and long line of windows; a lot of people will stand there during breaks. A doctor may be talking to his or her spouse and saying, ‘Yeah, this juror is sleeping!’ Jurors are [often] looking for drama. They’re looking for somebody letting their guard down. Hearing a doctor speak badly about them would certainly give them a reason to dislike the physician.”
Ms. Postilion warns against talking about jurors in or outside of the courtroom. This includes parking structures, she said.
Physicians can take additional steps to save themselves from negative judgment from jurors, attorneys say. Even before the trial starts, Ms. Postilion advises clients to make their social media accounts private. Some curious jurors may look up a physician’s social media accounts to learn more about their personal life, political leanings, or social beliefs, which could prejudice them against the doctor, she said.
Once on the stand, the words and tone used are key. The last thing a physician defendant wants is to come across as arrogant or condescending to jurors, said medical liability defense attorney Michael Moroney, of Flynn Watts LLC.
“For instance, a defendant might say, ‘Well, let me make this simple for you,’ as if they’re talking to a bunch of schoolchildren,” he said. “You don’t know who’s on the jury. That type of language can be offensive.”
Ms. Lander counsels her clients to refrain from using the common phrase, “honestly,” before answering questions on the stand.
“Everything you’re saying on the stand is presumed to be honest,” she said. “When you start an answer with, ‘Honestly…’ out of habit, it really does undercut everything that follows and everything else that’s already been said. It suggests that you were not being honest in your other answers.”
Attitude, body language speak volumes
Keep in mind that plaintiffs’ attorneys will try their best to rattle physicians on the stand and get them to appear unlikeable, says Mr. Clark, the Houston-based health law attorney. Physicians who lose their cool and begin arguing with attorneys play into their strategy.
“Plaintiffs’ attorneys have been trained in ways to get under their skin,” he said. “Righteous indignation and annoyance are best left for a rare occasion. Think about how you feel in a social setting when people are bickering in front of you. It’s uncomfortable at best. That’s how a jury feels too.”
Body language is also important, Mr. Clark notes. Physicians should avoid crossed arms, leaning back and rocking, or putting a hand on their mouth while testifying, he said. Many attorneys have practice sessions with their clients and record the interaction so that doctors can watch it and see how they look.
“Know your strengths and weaknesses,” he said. “Get help from your lawyer and perhaps consultants about how to improve these skills. Practice and preparation are important.”
Ms. Postilion goes over courtroom clothing with physician clients before trial. Anything “too flashy, too high-end, or too dumpy” should be avoided, she said. Getting accustomed to the courtroom and practicing in an empty courtroom are good ways to ensure that a physician’s voice is loud enough and projecting far enough in the courtroom, she adds.
“The doctor should try to be the best version of him- or herself to jurors,” she said. “A jury can pick up someone who’s trying to be something they’re not. A good attorney can help the doctor find the best version of themselves and capitalize on it. What is it that you want the jury to know about your care of the patient? Take that overall feeling and make sure it’s clearly expressed to the jury.”
A version of this article first appeared on Medscape.com.
Overall survival dips with vitamin D deficiency in melanoma
Whereas the 5-year overall survival was 90% when vitamin D serum levels were above a 10 ng/mL threshold, it was 84% when levels fell below it. Notably, the gap in overall survival between those above and below the threshold appeared to widen as time went on.
The research adds to existing evidence that “vitamin D levels can play an important and independent role in patients’ survival outcomes,” study investigator Inés Gracia-Darder, MD, told this news organization. “The important application in clinical practice would be to know if vitamin D supplementation influences the survival of melanoma patients,” said Dr. Gracia-Darder, a clinical specialist in dermatology at the Hospital Universitari Son Espases, Mallorca, Spain.
Known association, but not much data
“It is not a new finding,” but there are limited data, especially in melanoma, said Julie De Smedt, MD, of KU Leuven, Belgium, who was asked to comment on the results. Other groups have shown, certainly for cancer in general, that vitamin D can have an effect on overall survival.
“Low levels of vitamin D are associated with the pathological parameters of the melanoma, such as the thickness of the tumor,” Dr. De Smedt said in an interview, indicating that it’s not just overall survival that might be affected.
“So we assume that also has an effect on melanoma-specific survival,” she added.
That assumption, however, is not supported by the data Dr. Gracia-Darder presented, as there was no difference in melanoma-specific survival between the two groups of patients studied.
Retrospective cohort analysis
Vitamin D levels had been studied in 264 patients who were included in the retrospective cohort analysis. All had invasive melanomas, and all had been seen at the Hospital Clinic of Barcelona between January 1998 and June 2021. Their mean age was 57 years, and the median follow-up was 6.7 years.
For inclusion, all patients had to have had their vitamin D levels measured after being diagnosed with melanoma; those with a 25-hydroxyvitamin D3 serum level of less than 10 ng/mL were deemed to be vitamin D deficient, whereas those with levels of 10 ng/mL and above were deemed normal or insufficient.
A measurement less than 10 ng/mL is considered vitamin D deficiency, Dr. De Smedt said. “But there is a difference between countries, and there’s also a difference between societies,” noting the cut-off used in the lab where she works is 20 ng/mL. This makes it difficult to compare studies, she said.
Independent association with overall survival
Seasonal variation in vitamin D levels was considered as a possible confounding factor, but Dr. Gracia-Darder noted that there was a similar distribution of measurements taken from October to March and from April to September.
Univariate and multivariate analyses established vitamin D deficiency as independently associated with worse overall survival, with hazard ratios of 2.34 and 2.45, respectively.
Other predictive factors were a higher Breslow index, older age, and gender.
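The hazard ratios above come from survival models fitted to the study data, which are not reproduced here. As a rough intuition only: under a constant-event-rate assumption, an unadjusted hazard ratio reduces to a ratio of incidence rates (events per unit of person-time). A minimal sketch with made-up numbers, not the study’s actual data or model:

```python
# Illustrative only: a crude (unadjusted) hazard ratio from synthetic counts,
# assuming constant event rates in each group. This is NOT the study's Cox
# model, and the numbers below are invented for demonstration.

def crude_hazard_ratio(events_exposed, persontime_exposed,
                       events_ref, persontime_ref):
    """Ratio of incidence rates: (events / person-time) in the exposed
    group divided by the same rate in the reference group."""
    rate_exposed = events_exposed / persontime_exposed
    rate_ref = events_ref / persontime_ref
    return rate_exposed / rate_ref

# Hypothetical counts chosen so the ratio lands near the reported ~2.3-2.5:
# 12 deaths over 100 person-years (deficient) vs. 25 over 500 (normal).
hr = crude_hazard_ratio(12, 100.0, 25, 500.0)
print(round(hr, 2))  # 2.4
```

A real analysis would use a Cox proportional hazards model to adjust for covariates such as Breslow index, age, and gender, which is why the univariate and multivariate hazard ratios reported above differ slightly.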
Time to recommend vitamin D supplementation?
So should patients with melanoma have their vitamin D levels routinely checked? And what about advising them to take vitamin D supplements?
“In our practice, we analyze the vitamin D levels of our patients,” Dr. Gracia-Darder said. Patients are told to limit their exposure to the sun because of their skin cancer, so they are very likely to become vitamin D deficient.
While dietary changes or supplements might be suggested, there’s no real evidence to support upping vitamin D levels to date, so “future prospective studies are needed,” Dr. Gracia-Darder added.
Such studies have already started, including one in Italy, one in Australia, and another study that Dr. De Smedt has been involved with for the past few years.
Called the ViDMe study, it’s a multicenter, randomized, double-blind trial in which patients are being given a high-dose oral vitamin D supplement or placebo once a month for at least 1 year. About 430 patients with a first cutaneous malignant melanoma have been included in the trial, which started in December 2012.
It is hoped that the results will show that the supplementation will have had a protective effect on the risk of relapse and that there will be a correlation between vitamin D levels in the blood and vitamin D receptor immunoreactivity in the tumor.
“The study is still blinded,” Dr. De Smedt said. “We will unblind in the coming months and then at the end of the year, maybe next year, we will have the results.”
The study reported by Dr. Gracia-Darder did not receive any specific funding. Dr. Gracia-Darder disclosed that the melanoma unit where the study was performed receives many grants and funds to carry out research. She reported no other relevant financial relationships. Dr. De Smedt had no relevant financial relationships. The ViDMe study is sponsored by the Universitaire Ziekenhuizen Leuven.
A version of this article first appeared on Medscape.com.
FROM THE EADV CONGRESS
Full-dose antithrombotic aids selected COVID-19 ICU patients
BARCELONA – Hospitalized patients in the ICU because of an acute COVID-19 infection had significantly fewer thrombotic events and complications when treated with full-dose anticoagulation, compared with patients who received standard-dose anticoagulation prophylaxis, but full-dose anticoagulation also triggered an excess of moderate and severe bleeding events, randomized trial results show.
The new findings from the COVID-PACT trial in an exclusively U.S.-based cohort of 382 on-treatment patients in the ICU with COVID-19 infection may lead to a change in existing guidelines, which currently recommend standard-dose prophylaxis based on results from prior head-to-head comparisons, such as guidelines posted March 2022 from the American Society of Hematology.
The choice of regimen should be made after weighing an individual patient’s risk for both thrombotic events and bleeding, David D. Berg, MD, said at the annual congress of the European Society of Cardiology. Simultaneous with his report at the congress, the results also appeared online in the journal Circulation.
“What the results tell us is that full-dose anticoagulation in critically ill patients with COVID-19 is highly effective for reducing thrombotic complications,” said Dr. Berg, a cardiologist and critical care physician at Brigham and Women’s Hospital, Boston.
The report’s designated discussant agreed with Dr. Berg’s conclusions.
‘Need to replace the guidelines’
“We probably need to replace the guidelines,” said Eduardo Ramacciotti, MD, PhD, MPH, a professor of vascular surgery at Santa Casa School of Medicine, São Paulo. Dr. Ramacciotti praised the study’s design, the endpoints, and the fact that the design excluded patients at high risk for bleeding complications, particularly those with a fibrinogen level below 200 mg/dL (2 g/L).
But other experts questioned the significance of the COVID-PACT results given that the outcomes did not show that full-dose anticoagulation produced incremental improvement in patient survival.
“We should abandon the thought that intensified anticoagulation should be routine, because it did not overall increase the number of patients discharged from the hospital alive,” commented John W. Eikelboom, MBBS, a professor of hematology and thromboembolism at McMaster University, Hamilton, Ont.
“Preventing venous thrombosis is a good thing, but the money is in saving lives and stopping need for ventilation, and we haven’t been successful doing that with an antithrombotic strategy,” said Dr. Eikelboom. “It is useful to prevent venous thrombosis, but we need to look elsewhere to improve the outcomes of [critically ill] patients with COVID-19.”
Reducing thromboembolism is a ‘valid goal’
Dr. Berg took a different view. “It’s a valid goal to try to reduce venous thromboembolism complications,” the major benefit seen in his study, he said. “There is clinical significance to reducing thrombotic events in terms of how people feel, their functional status, and their complications. There are a lot of clinically relevant consequences of thrombosis beyond mortality.”
COVID-PACT ran at 34 U.S. centers from August 2020 to March 2022 but stopped short of its enrollment goal of 750 patients because of waning numbers of patients with COVID-19 admitted to ICUs. In addition to randomly assigning patients within 96 hours of their ICU admission to full-dose anticoagulation or to standard-dose antithrombotic prophylaxis, the study included a second, concurrent randomization to the antiplatelet agent clopidogrel (Plavix) or to no antiplatelet drug. Both randomizations used an open-label design.
The results failed to show a discernible effect from adding clopidogrel on either the primary efficacy or primary safety endpoint, adding to accumulated evidence that treatment with an antiplatelet agent, including aspirin, confers no antithrombotic benefit in patients with COVID-19.
The trial’s participants averaged 61 years old, 68% were obese, 59% had hypertension, and 32% had diabetes. The median time after ICU admission when randomized treatment began was 2.1 days, and researchers followed patients for a median of 13 days, including a median time on anticoagulation of 10.6 days.
The trial design allowed clinicians to use either low molecular weight heparin or unfractionated heparin for anticoagulation, and 82% of patients received low molecular weight heparin as their initial treatment. The prespecified design called for an on-treatment analysis because of an anticipated high crossover rate. During the trial, 34% of patients who started on the prophylactic dose switched to full dose, and 17% had the reverse crossover.
95% increased win ratio with full dose
The study’s primary efficacy endpoint used a win-ratio analysis that included seven different adverse outcomes that ranged from death from venous or arterial thrombosis to clinically silent deep vein thrombosis. Treatment with full-dose anticoagulation led to a significant 95% increase in win ratio.
Researchers also applied a more conventional time-to-first-event secondary efficacy analysis, which showed that full-dose anticoagulation cut the incidence of an adverse outcome by a significant 44% relative to prophylactic dosing.
The two study groups showed no difference in all-cause death rates. The efficacy advantage of the full-dose regimen was driven by reduced rates of venous thrombotic events, especially a reduction in clinically evident deep vein thrombotic events.
The primary safety endpoint was the rate of fatal or life-threatening bleeding episodes. While life-threatening bleeds were numerically more common among the full-dose recipients (four events, compared with one event on prophylactic dosing), the difference was not significant, and no patients died from a bleeding event.
More secondary safety bleeds
The safety difference showed up in a secondary measure of bleeding severity, the rate of GUSTO moderate or severe bleeds. These occurred in 15 of the full-dose recipients, compared with 1 patient on the prophylactic dose.
Dr. Berg highlighted that several prior studies have assessed various anticoagulation regimens in critically ill (ICU-admitted and on respiratory or cardiovascular support) patients with COVID-19. For example, two influential reports published in 2021 by the same team of investigators in the New England Journal of Medicine had sharply divergent results.
One multicenter study, which tested full-dose heparin against prophylactic treatment in more than 1,000 critically ill patients, was stopped prematurely because it had not shown a significant difference between the treatment arms. The second study, in more than 2,000 multicenter patients with COVID-19 who did not require critical-level organ support, showed clear superiority of the full-dose heparin regimen.
Notably, both previous studies used a different primary efficacy endpoint than the COVID-PACT study. The earlier reports both measured efficacy in terms of patients being alive and off organ support by 21 days from randomization.
Patients to exclude
Although Dr. Berg stressed the clear positive result, he also cautioned that the findings should not be applied to patients excluded from the study: those with severe coagulopathies, those with severe thrombocytopenia, and patients already maintained on dual antiplatelet therapy. He also cautioned against using the full-dose strategy in elderly patients, because in COVID-PACT, those who developed bleeding complications tended to be older.
Dr. Berg also noted that heparin prophylaxis is a well-established intervention for preventing venous thromboembolism in ICU-admitted patients without COVID-19, although there is no evidence that this approach reduces deaths or organ failure.
But he conceded that “the priority of treatment depends on whether it saves lives, so anticoagulation is probably not as high a priority as other effective treatments” that reduce mortality. “Preventing venous thromboembolism has rarely been shown to have a mortality benefit,” Dr. Berg noted.
COVID-PACT received no direct commercial funding. Dr. Berg has been a consultant to AstraZeneca, Mobility Bio, and Youngene Therapeutics, and he participated in a trial sponsored by Kowa. Dr. Ramacciotti has been a consultant to or speaker on behalf of Aspen, Bayer, Daiichi Sankyo, Mylan, Pfizer, and Sanofi, and he has received research support from Bayer, Esperon, Novartis, and Pfizer. Dr. Eikelboom has received honoraria and research support from Bayer.
A version of this article first appeared on Medscape.com.