Defending the Home Planet


Like me, some of you may have been following the agonizing news about the unprecedented bushfires in Australia that have devastated human, animal, and plant life in that country so culturally akin to our own.1 For many people who accept the overwhelming majority of scientific reports on climate change, these apocalyptic fires are an empirical demonstration of the dire prophecies for the future of our planet. Scientists have shown that although climate change may not have caused the worst fires in Australia’s history, it likely contributed to the conditions that enabled them to spread so far and wide and reach such destructive intensity.2 The heartbreaking pictures of singed koalas and displaced people, and the helpless feeling that all I can do from here is donate money, set me to thinking about the relationship among the military, health, and climate change, which is the subject of this column.

As I write this in mid-January of a new decade and glance at the news headlines, I read about an earthquake in Puerto Rico and tornadoes in the southern US. It seems all too plausible that our comfortable lifestyle and technological civilization could, in the coming decades, go the way of the dinosaurs, also victims of climate change.

My first thought about this relationship was a negative one: images of scorched-earth policies stretching back to ancient wars jump to mind. Reflection and research on the topic, though, suggest that the relationship is more complicated and conflicted. Alas, I can only touch on a few of its themes in this brief format.

It may be less obvious that climate change also threatens the military, the guardian of that civilization. In 2018, for example, Hurricane Michael caused nearly $5 billion in damage to Tyndall Air Force Base in Florida.3 A year later, the US Department of Defense (DoD) released a congressionally mandated report on the effects of climate change.4 Even though some congressional critics expressed concern about the report’s lack of depth and detail,5 it asserted that “The effects of a changing climate are a national security issue with potential impacts to Department of Defense (DoD or the Department) missions, operational plans, and installations.”4

The US Department of Veterans Affairs (VA) is not immune either. Natural disasters have already disrupted the delivery of health care at its many aging facilities. Climate change was called the “engine”6 driving Hurricane Maria, which in 2017 slammed into Puerto Rico, including its VA medical center, resulting in shortages of supplies, staff, and basic utilities.7 The facility and the island are still trying to rebuild. In response to the weather-exposed vulnerability of VA infrastructure, Senator and presidential candidate Elizabeth Warren (D-MA) and Senator Brian Schatz (D-HI), the ranking member of the Subcommittee on Military Construction, sent a letter to VA leadership arguing that “Strengthening VA’s resilience to climate change is consistent with the agency’s mission to deliver timely, high-quality care and benefits to America’s veterans.”8

It has been reported that the current administration has countered initiatives to prepare for the challenges of providing health care to service members and veterans in a climate-changed world.9 Sadly, but predictably, in the politicized federal health care arena, the safety of our service members, and the domestic and national security and peace that depend on them, is caught in the partisan debate over global warming. Still, it is unlikely that Congress or federal agency leaders will abandon planning to safeguard service members who will see duty and combat in a radically altered ecology, or veterans who will need VA to remain a reliable safety net in an increasingly erratic environment.10

Climate change is a divisive political issue, and there is a proud tradition of conservatism and self-reliance among military members, active duty and veteran alike. That is why I was surprised and impressed by the results of a recent survey on climate change. In January 2019, 293 active-duty service members and veterans were surveyed.

Participants were selected to reflect the ethnic makeup, educational level, and political allegiance of the military population, which enhanced the generalizability of the findings.11 Participants were asked to indicate whether they believed that the earth was warming secondary to human or natural processes, was not growing warmer at all, or whether they were unsure. Similar to the general population, 46% agreed that climate change is anthropogenic.11 More than three-fourths believed it was likely climate change would adversely affect the places they worked, such as military installations; 61% thought it likely that global warming could lead to armed conflict over resources. Seven in 10 respondents believed the climate is changing. Of respondents who believe climate change is real, 87% see it as a threat to military bases, compared with 60% of those who do not accept the science that the earth is warming.11

This survey, though, is only a small study, and the military and VA are big tents under which a wide range of political persuasions and diverse beliefs coexist. Many readers of Federal Practitioner will no doubt reject nearly every word of what I know is a controversial column. But it matters that the military and veteran constituency is thinking and speaking about the issue of climate change.11 Why? The answer takes us back to the disaster in Australia. When the fires and the devastation they wrought escalated beyond the power of civil authorities to handle, it was the military, with its technical skill, coordinated readiness, and personal courage and dedication, that was called on to rescue thousands of civilians from the inferno.12 So it will be in our country and around the world when disasters, man-made, natural, or both, threaten to engulf life in all its wondrous variety. Those who battle extreme weather will have unique health needs, and their valiant sacrifices deserve health care systems ready and able to treat them.

References

1. Thompson A. Australia’s bushfires have likely devastated wildlife–and the impact will only get worse. Scientific American. https://www.scientificamerican.com/article/australias-bushfires-have-likely-devastated-wildlife-and-the-impact-will-only-get-worse. Published January 8, 2020. Accessed January 16, 2020.

2. Gibbens S. Intense ‘firestorms’ forming from Australia’s deadly wildfires. National Geographic. https://www.nationalgeographic.com/science/2020/01/australian-wildfires-cause-firestorms. Published January 9, 2020. Accessed January 15, 2020.

3. Shapiro A. Tyndall Air Force Base still faces challenges in recovering from Hurricane Michael. NPR. https://www.npr.org/2019/05/31/728754872/tyndall-air-force-base-still-faces-challenges-in-recovering-from-hurricane-micha. Published May 31, 2019. Accessed January 16, 2020.

4. US Department of Defense, Office of the Undersecretary for Acquisition and Sustainment. Report on effects of a changing climate to the Department of Defense. https://www.documentcloud.org/documents/5689153-DoD-Final-Climate-Report.html. Published January 2019. Accessed January 16, 2020.

5. Maucione S. DoD justifies climate change report, says response was mission-centric. Federal News Network. https://federalnewsnetwork.com/defense-main/2019/03/dod-justifies-climate-change-report-says-response-was-mission-centric. Published March 28, 2019. Accessed January 16, 2020.

6. Shane L 3rd. Puerto Rico’s VA hospital weathers Maria, but challenges loom. Army Times. https://www.armytimes.com/veterans/2017/09/22/puerto-ricos-va-hospital-weathers-hurricane-maria-but-challenges-loom. Published September 22, 2017. Accessed January 16, 2020.

7. Hersher R. Climate change was the engine that powered Hurricane Maria’s devastating rains. NPR. https://www.npr.org/2019/04/17/714098828/climate-change-was-the-engine-that-powered-hurricane-marias-devastating-rains. Published April 17, 2019. Accessed January 16, 2020.

8. Senators Warren and Schatz request an update from the Department of Veterans Affairs on efforts to build resilience to climate change [press release]. https://www.warren.senate.gov/oversight/letters/senators-warren-and-schatz-request-an-update-from-the-department-of-veterans-affairs-on-efforts-to-build-resilience-to-climate-change. Published October 1, 2019. Accessed January 16, 2020.

9. Simkins JD. Navy quietly ends climate change task force, reversing Obama initiative. Navy Times. https://www.navytimes.com/off-duty/military-culture/2019/08/26/navy-quietly-ends-climate-change-task-force-reversing-obama-initiative. Published August 26, 2019. Accessed January 16, 2020.

10. Eilperin J, Dennis B, Ryan M. As White House questions climate change, U.S. military is planning for it. The Washington Post. https://www.washingtonpost.com/national/health-science/as-white-house-questions-climate-change-us-military-is-planning-for-it/2019/04/08/78142546-57c0-11e9-814f-e2f46684196e_story.html. Published April 8, 2019. Accessed January 16, 2020.

11. Motta M, Spindel J, Ralston R. Veterans are concerned about climate change and that matters. The Conversation. http://theconversation.com/veterans-are-concerned-about-climate-change-and-that-matters-110685. Published March 8, 2019. Accessed January 16, 2020.

12. Albeck-Ripka L, Kwai I, Fuller T, Tarabay J. ‘It’s an atomic bomb’: Australia deploys military as fires spread. The New York Times. https://www.nytimes.com/2020/01/04/world/australia/fires-military.html. Updated January 5, 2020. Accessed January 18, 2020.

Author and Disclosure Information

Correspondence: Cynthia Geppert ([email protected])

 

Disclaimer
The opinions expressed herein are those of the author and do not necessarily reflect those of Federal Practitioner, Frontline Medical Communications Inc., the US Government, or any of its agencies.

Issue
Federal Practitioner - 37(2)a
Page Number
67-68

Stimulant Medication Prescribing Practices Within a VA Health Care System

Setting clear expectations for patients and prescribers before and during prescription use, together with the development of a clinical practice protocol, may reduce misuse of stimulant medications by patients.

Dispensing of prescription stimulant medications, such as methylphenidate or amphetamine salts, has expanded at a rapid rate over the past 2 decades: an astounding 58 million stimulant prescriptions were written in 2014.1,2 Adults now exceed youths in the proportion of stimulant medications prescribed.1,3

Off-label use of prescription stimulant medications, such as for performance enhancement, fatigue management, weight loss, medication-assisted therapy for stimulant use disorders, and adjunctive treatment for certain depressive disorders, reportedly accounts for ≥ 40% of total stimulant use and is much more common in adults.1 A 2017 study assessing risk of amphetamine use disorder and mortality among veterans prescribed stimulant medications within the Veterans Health Administration (VHA) reported off-label use in nearly 3 of every 5 incident users in 2012.4 Off-label use also is significantly more common when stimulants are prescribed by nonpsychiatric physicians than by psychiatrists.1

One study assessing stimulant prescribing from 2006 to 2009 found that nearly 60% of adults prescribed stimulant medications received them from nonpsychiatrist physicians, and only 34% of those adults had a diagnosis of attention-deficit hyperactivity disorder (ADHD).5 Findings from managed care plans covering 2000 to 2004 were similar: only 30% of adult patients prescribed methylphenidate had at least 1 medical claim with a diagnosis of ADHD.6 Of the approximately 16 million adults prescribed stimulant medications in 2017, > 5 million reported stimulant misuse.3 Much attention has been focused on misuse of stimulant medications by youths and young adults, but new information suggests that increased monitoring also is needed among the US adult population. Per the US Department of Veterans Affairs (VA) Academic Detailing Stimulant Dashboard, as of October 2018 more than 20% of veterans prescribed stimulant medications through the VHA have a documented substance use disorder (SUD), < 50% have an annual urine drug screen (UDS), and > 10% are coprescribed opioids and benzodiazepines. The percentage of veterans prescribed stimulant medications in the presence of a SUD has increased over the past decade, from a reported 8.7% in 2002 to 14.3% in 2012.4

There are currently no protocols, prescribing restrictions, or required monitoring parameters in place for prescription stimulant use within the Lexington VA Health Care System (LVAHCS). The purpose of this study was to evaluate stimulant prescribing practices at LVAHCS and to identify opportunities for improvement in the prescribing and monitoring of this drug class.

Methods

This study was a single-center quality improvement project evaluating the prescribing practices of stimulant medications within LVAHCS; it was exempt from institutional review board approval. Veterans were included if they were prescribed amphetamine salts, dextroamphetamine, lisdexamfetamine, or methylphenidate between January 1, 2018 and June 30, 2018; however, each veteran's entire stimulant use history was assessed. Veterans were excluded if their duration of use was < 2 months or if they filled < 2 prescriptions during the study period. Data for veterans who met the inclusion criteria and none of the exclusion criteria were collected via chart review and Microsoft SQL Server Management Studio.
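As a rough sketch, the inclusion and exclusion criteria above can be expressed as a simple filter over prescription records; the record fields and the helper below are hypothetical illustrations, not the actual query logic used at LVAHCS.

```python
from datetime import date

# Stimulants named in the inclusion criteria
STIMULANTS = {"amphetamine salts", "dextroamphetamine",
              "lisdexamfetamine", "methylphenidate"}
WINDOW = (date(2018, 1, 1), date(2018, 6, 30))  # study period

def include(record):
    """Apply the study's inclusion/exclusion criteria to one veteran's
    record (field names are illustrative)."""
    fills = [f for f in record["fills"]
             if f["drug"] in STIMULANTS
             and WINDOW[0] <= f["date"] <= WINDOW[1]]
    if not fills:
        return False          # no qualifying stimulant fill in the window
    if record["months_of_use"] < 2:
        return False          # excluded: duration of use < 2 months
    if len(fills) < 2:
        return False          # excluded: < 2 fills during the study period
    return True
```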


Collected data included age, gender, stimulant regimen (drug name, dose, frequency), indication and duration of use, prescriber name and specialty, prescribing origin of the initial stimulant medication, and whether stimulant use predated military service. Monitoring of stimulant medications was assessed as collection of a UDS at least annually, query of the prescription drug monitoring program (PDMP) at least quarterly, and the average time between follow-up appointments with the stimulant prescriber.

Monitoring parameters were assessed from January 1, 2017 through June 30, 2018, as it was felt that the 6-month study period would be too narrow to accurately assess monitoring trends. Mental health diagnoses, ADHD diagnostic testing if applicable, documented SUD or stimulant misuse past or present, and concomitant central nervous system (CNS) depressant use also were collected. CNS depressants evaluated were those that have abuse potential or significant psychotropic effects and included benzodiazepines, antipsychotics, opioids, gabapentin/pregabalin, Z-hypnotics, and muscle relaxants.
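The three monitoring metrics reduce to date arithmetic over the January 2017 to June 2018 window; the function below is an illustrative sketch (the field names and interval thresholds are assumptions, not the study's actual tooling).

```python
from datetime import date

def monitoring_summary(uds_dates, pdmp_dates, visit_dates,
                       start=date(2017, 1, 1), end=date(2018, 6, 30)):
    """Summarize three monitoring metrics over the assessment window:
    annual UDS collection, quarterly PDMP queries, and the average
    interval between prescriber follow-up visits."""
    days = (end - start).days                 # 545 days for this window
    in_window = lambda d: start <= d <= end
    gaps = [(b - a).days for a, b in zip(visit_dates, visit_dates[1:])]
    return {
        # "at least annually/quarterly" = one event per year/quarter covered
        "uds_annual": sum(map(in_window, uds_dates)) >= days / 365.25,
        "pdmp_quarterly": sum(map(in_window, pdmp_dates)) >= days / 91.3,
        "avg_followup_days": sum(gaps) / len(gaps) if gaps else None,
    }
```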

Results

The majority of participants were male (168/200), with an average age of 43.3 years. Dextroamphetamine/amphetamine was the most used stimulant (48.5%), followed by methylphenidate (40%) and dextroamphetamine (10%). Lisdexamfetamine was the least used stimulant, likely due to its formulary-restricted status within this facility. An extended-release (ER) formulation was used in 1 in 4 participants, and 1 in 20 participants was prescribed a combination of immediate-release (IR) and ER formulations. Duration of use ranged from 3 months to 14 years, with an average of 4 years (Table 1).

Nearly 40% of participants reported an origin of stimulant initiation outside of LVAHCS. Fourteen percent of participants were started on prescription stimulant medications while active-duty service members. Stimulant medications were initiated at another VA facility in 10.5% of instances, and 15% of participants reported being prescribed stimulant medications by a civilian prescriber prior to receiving them at LVAHCS. Seventy-four of 79 (93.6%) participants with an origin of stimulant prescription outside of LVAHCS reported a US Food and Drug Administration (FDA)-approved indication for use. The majority (87%) of stimulant medications were prescribed by the mental health service, and 25% of initial stimulant prescriptions were written by a single mental health prescriber. Eleven percent of participants were prescribed stimulant medications by multiple specialties, and nearly all participants had > 1 stimulant prescriber over the course of their treatment. More than 10% of veterans had their stimulant medication discontinued by one prescriber and then restarted by another.

Stimulant medications were used for FDA-approved indications (ADHD and narcolepsy) in 69.5% of participants. Notably, this included patients who maintained an ADHD diagnosis in their medical record even if it was not substantiated with diagnostic testing. Of the participants reporting ADHD as the indication for stimulant use, diagnostic testing was conducted at LVAHCS to confirm an ADHD diagnosis in 58.6% (78/133); 20.5% (16/78) of these diagnostic tests did not support the diagnosis of ADHD. All documented indications for use can be found in Table 2.



As expected, the most common indication was ADHD (66.5%), followed by ADHD-like symptoms (9%), refractory depression (7%), and fatigue (5.5%). Fourteen percent of participants had ≥ 1 change in indication for use, with some participants having up to 4 different documented indications while being prescribed stimulant medications. Twelve percent of participants either were denied stimulant initiation or had current stimulant medications discontinued by one health care provider and restarted by another following a prescriber change. Aside from the indication for stimulant use, 90% of participants had at least 1 additional mental health diagnosis. The rates of all mental health diagnoses documented in the medical record problem list can be found in Table 3.



A UDS was collected at least annually in 37% of participants. A methylphenidate confirmatory screen was ordered to assess adherence in just 2 (2.5%) participants prescribed methylphenidate. While participants were actively prescribed stimulant medications, the PDMP was queried quarterly in 26% of participants. Time to follow-up with the prescriber ranged from 1 to 15 months, and 40% of participants had follow-up at least quarterly. The incidence of SUD, either active or in remission, differed depending on whether it was identified via the problem list (36/200) or prescriber documentation (63/200). The most common SUD was alcohol use disorder (13%), followed by cannabis use disorder (5%), polysubstance use disorder (5%), opioid use disorder (4.5%), stimulant use disorder (2.5%), and sedative use disorder (1%). Twenty-five participants currently prescribed stimulant medications had stimulant abuse/misuse documented in their medical record. Fifty-four percent of participants were prescribed at least 1 CNS depressant considered to have abuse potential or significant psychotropic effects. Opioids were most common (23%), followed by muscle relaxants (15.5%), benzodiazepines (15%), antipsychotics (13%), gabapentin/pregabalin (12%), and Z-hypnotics (12%).

Discussion

The source of the initial stimulant prescription was assessed. The majority of veterans had received medical care prior to receiving care at LVAHCS, whether on active duty, at another VA facility, or from a private civilian prescriber. The origin of the initial stimulant medication and the indication for stimulant use were patient reported. Requiring medical records from civilian providers prior to continuing stimulant medication is prescriber dependent, and records were not available for all participants.

As expected, the majority of participants (87%) received their first stimulant prescription from a prescriber in the mental health specialty; 20 were prescribed stimulant medications by primary care, 4 by the emergency department (ED), and 2 by neurology. Three of the 4 stimulant prescriptions written in the ED were for continuity of care until the veteran could have an appointment with a mental health or primary care provider; the fourth was prescribed by a mental health nurse practitioner for a veteran who presented to the ED with complaints of ADHD-like symptoms. More than 10% of veterans had their stimulant medication discontinued by one prescriber and then restarted by another.

The reasons for discontinuation included a positive UDS result for cocaine, psychosis, broken narcotic contract, ADHD diagnosis not supported by psychological testing, chronic bipolar disorder secondary to stimulant use, diversion, stimulant misuse, and lack of indication for use. There also were a handful of veterans whose VA prescribers declined to initiate prescription stimulant medications for various reasons; these veterans sought care from a civilian prescriber who prescribed stimulant medications, then returned to the VA for medication management, where the stimulant medications were continued. Fourteen percent (28/200) of participants had multiple indications for use at some point during stimulant medication therapy. Eight of these represented a reasonable change from ADHD to ADHD-like symptoms when the diagnosis was not substantiated by testing. The causes of other changes in indication were often unclear and not well documented. One veteran had 4 different indications for use documented in the medical record, often changing with each change in prescriber. It appeared that the most recent prescriber was uncertain of the actual indication for use but did not want to discontinue the medication. This prescriber documented that the stimulant medication should continue for presumed ADHD/mood/fatigue/cognitive dysfunction, which were all of the indications documented by the veteran's previous prescribers.

Reasons for Discontinuation

ADHD was the most prominent indication for use, although the indication was changed to ADHD-like symptoms in several veterans for whom diagnostic testing did not support the ADHD diagnosis. Seventy-eight of 133 veterans prescribed stimulant medications for ADHD received diagnostic testing via a psychologist at LVAHCS. For the 11 veterans tested after stimulant initiation, a stimulant-free period was required prior to testing to ensure an accurate diagnosis. For 21% of tested veterans, the ADHD diagnosis was unsubstantiated by formal testing; however, all of these veterans continued stimulant medication use. For 1 veteran, the psychologist performing the testing documented new diagnoses, including moderate to severe stimulant use disorder and malingering of both posttraumatic stress disorder (PTSD) and ADHD. The rates of stimulant prescribing inconsistency, “prescriber-hopping,” and unsupported ADHD diagnoses warrant a conversation about expectations for transitions of care regarding stimulant medications, not only from outside to inside LVAHCS, but from prescriber to prescriber within the facility.

In some cases, stimulant medications were discontinued by a prescriber secondary to worsening of another mental health condition. More than half of the participants in this study had an anxiety disorder diagnosis. Whether anxiety predated stimulant use or whether stimulant medications contributed to the diagnosis, and thus to the addition of a CNS depressant to treat anxiety, may be an area for future research. Although bipolar disorder, anxiety disorders, psychosis, and SUD are not contraindications to stimulant medications, caution must be used in patients with these diagnoses: prescribers must weigh risks vs benefits and monitor closely during use. Similarly, one might look further into stimulant medications prescribed for fatigue and assess the role of any simultaneously prescribed CNS depressants. Is the stimulant being used to treat an adverse effect (AE) of another medication? In 2 documented instances in this study, the psychologist who conducted diagnostic testing reported that the veteran did not meet the criteria for ADHD but that a stimulant might help counteract the iatrogenic effect of anticonvulsants. In both instances stimulant use continued.

Prescription Monitoring

Polysubstance use disorder (5%) was the third most common SUD recorded among study participants. The majority of those with polysubstance use disorder reported abuse/misuse of illicit or prescribed stimulants. Stimulant abuse/misuse was documented in 25 of 200 (12.5%) study participants. In several instances, abuse/misuse was detected by the LVAHCS delivery coordination pharmacist, who tracks patterns of early fill requests and prescriptions reported lost/stolen. This pharmacist may request that the prescriber obtain a PDMP query, UDS, or pill count if concerning patterns are noted. Lisdexamfetamine is a formulary-restricted medication at LVAHCS, but it was approved for use when prescribers requested an abuse-deterrent formulation. Investigators noted that veterans whose prescriptions exceeded the recommended maximum dosage often also had stimulant abuse/misuse documented in their medical record. The highest documented total daily dose in this study was 120 mg of amphetamine salts IR for ADHD, compared with the recommended dosing range of 5 to 40 mg/d for the same indication.
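A pharmacist review trigger for doses above the recommended ceiling could be sketched as below; only the 40 mg/d amphetamine-salts figure comes from the text above, and the lookup structure is a hypothetical illustration rather than labeling guidance.

```python
# Illustrative maximum recommended daily doses (mg/d); only the
# amphetamine-salts value for ADHD is taken from the text above.
MAX_DAILY_MG = {"amphetamine salts IR": 40}

def flag_dose(drug, total_daily_mg, limits=MAX_DAILY_MG):
    """Return True when the total daily dose exceeds the recommended
    ceiling, as a trigger for pharmacist review."""
    ceiling = limits.get(drug)
    return ceiling is not None and total_daily_mg > ceiling
```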

Various modalities were used to monitor participants, but less than half of veterans had an annual UDS, quarterly PDMP query, and quarterly prescriber follow-up. PDMP queries and prescriber follow-up were assessed quarterly, a reasonable interval given that private sector practitioners may issue multiple prescriptions authorizing the patient to receive up to a 90-day supply.7 Prescriber follow-up ranged from 1 to 15 months; longer times to follow-up were seen more frequently when stimulant medications were prescribed by primary care rather than mental health.

Clinical Practice Protocol

Data from this study were collected with the intent to identify opportunities for improvement in the prescribing and monitoring of stimulant medications. From these results, investigators concluded that the facility may benefit from implementation of a facility-specific clinical practice protocol (CPP) for stimulant prescribing. It also may be beneficial to formulate a chronic stimulant management agreement between patient and prescriber to provide informed consent and clear expectations prior to stimulant medication initiation.

A CPP could be used to establish stimulant prescribing rules within a facility. It might limit who can prescribe stimulant medications or require a review process and/or specific documentation in the medical record when prescribing falls outside the dosing ranges and indications designated in the CPP or other evidence-based guidelines. Transition of care was found to be an area of opportunity in this study, which could be mitigated by requiring a baseline assessment prior to stimulant initiation, completed regardless of prior prescription stimulant medication use. The lack of consistent monitoring for participants in this study may be improved if required monitoring parameters and frequencies were provided for prescribers. For example, monitoring of heart rate and blood pressure was not assessed in this study, but a CPP may include monitoring vital signs before and after each dose change and every 6 months, per the National Institute for Health and Care Excellence ADHD diagnosis and management guideline published in 2018.8

The CPP also may list the responsibilities of all those involved in the prescribing of stimulant medications, such as mental health service leadership, prescribers, nursing staff, pharmacists, social workers, psychologists, and other mental health staff. For prescribers, this may include a thorough baseline assessment and criteria for use that must be met prior to stimulant initiation, documentation that must be included in the medical record, required monitoring during stimulant treatment, and expectations for increased monitoring and/or termination of treatment with nonadherence, diversion, or abuse/misuse.
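One way a CPP's required monitoring could be encoded is as a declarative schedule that a dashboard or pharmacist review evaluates; the event names and intervals below are illustrative assumptions, not facility policy.

```python
# Hypothetical CPP monitoring schedule: maximum days allowed between
# required events (intervals are illustrative, not facility policy).
CPP_SCHEDULE = {"uds": 365, "pdmp_query": 91,
                "vital_signs": 182, "prescriber_followup": 91}

def overdue(last_done_days_ago, schedule=CPP_SCHEDULE):
    """Given days since each monitoring event last occurred, list the
    events that are past their required interval (missing = overdue)."""
    return sorted(k for k, limit in schedule.items()
                  if last_done_days_ago.get(k, float("inf")) > limit)
```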

The responsibilities of pharmacists may include establishing criteria for use of nonformulary and restricted agents, completing nonformulary/restricted requests, reviewing dosages that exceed the recommended FDA daily maximum, reviewing uncommon off-label uses of stimulant medications, reviewing and documenting early fill requests, potential nonadherence, and potential drug-seeking behavior, and communicating this information to the primary prescriber. For other mental health staff, responsibilities may include documenting any reported AEs of the medication, referring the patient to their prescriber or pharmacist for medication questions or concerns, and assessing effectiveness and/or worsening behavior during patient contact.

Limitations

One limitation of this study was the way data were pulled from patient charts. For example, only 3 of 200 participants had insomnia per diagnosis codes, whereas that number was substantially higher when chart review was used to assess active prescriptions for sleep aids or documented complaints of insomnia in prescriber progress notes. For the same reason, rates of SUDs must be interpreted with caution. SUD diagnoses, both current and in remission, were taken into account during data collection. Per diagnosis codes, 36 (18%) veterans in this study had a history of SUD, but this number was higher (63 [31.5%]) on chart review. The majority of discrepancies arose when participants reported a history of SUD to the prescriber, but this information was not captured in the problem list or encounter codes. What some may consider a minor omission in documentation can have a large impact on patient care, as prescribers are unlikely to have adequate administrative time to complete the kind of chart review required of investigators in this study to assemble a complete past medical history. For this reason, incomplete provider documentation and the human error inherent in retrospective chart review also were identified as study limitations.
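The discrepancy between diagnosis codes and chart review suggests reconciling both sources. The sketch below unions structured ICD-10 codes with a crude keyword scan of note text as a stand-in for the study's manual review; the field names and keyword list are hypothetical.

```python
def sud_history(problem_list, progress_notes,
                keywords=("substance use disorder", "alcohol use disorder",
                          "opioid use disorder")):
    """Union of structured codes and a crude keyword scan of note text,
    a stand-in for the manual chart review used in the study."""
    # ICD-10 F10-F19 is the substance-related disorders block
    coded = any(code.startswith("F1") for code in problem_list)
    noted = any(k in note.lower() for note in progress_notes for k in keywords)
    return coded or noted
```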


Conclusion

Our data show that there is still substantial room for improvement in the prescribing and monitoring of stimulant medications. The rates of stimulant prescribing inconsistency, prescriber-hopping, and ADHD diagnoses unsupported by formal diagnostic testing warrant a review of the processes for transitions of care regarding stimulant medications, both within and outside of this facility. A lack of consistent monitoring also was identified. One of the most appreciable opportunities arising from this study is the need for consistency in both the prescribing and monitoring of stimulant medications. From these results, investigators concluded that the facility may benefit from implementation of a CPP for stimulant prescribing as well as a chronic stimulant management agreement to provide clear expectations for patients and prescribers prior to and during prescription stimulant use.

Acknowledgments 

We thank Tori Wilhoit, PharmD candidate, and Dana Fischer, PharmD candidate, for their participation in data collection and Courtney Eatmon, PharmD, BCPP, for her general administrative support throughout this study.

References

1. Safer DJ. Recent trends in stimulant usage. J Atten Disord. 2016;20(6):471-477.

2. Jones C; US Food and Drug Administration. The opioid epidemic overview and a look to the future. http://www.agencymeddirectors.wa.gov/Files/OpioidConference/2Jones_OPIOIDEPIDEMICOVERVIEW.pdf. Published June 12, 2015. Accessed January 16, 2020.

3. Compton WM, Han B, Blanco C, Johnson K, Jones CM. Prevalence and correlates of prescription stimulant use, misuse, use disorders, and motivations for misuse among adults in the United States. Am J Psychiatry. 2018;175(8):741-755.

4. Westover AN, Nakonezny PA, Halm EA, Adinoff B. Risk of amphetamine use disorder and mortality among incident users of prescribed stimulant medications in the Veterans Health Administration. Addiction. 2018;113(5):857-867.

5. Olfson M, Blanco C, Wang S, Greenhill LL. Trends in office-based treatment of adults with stimulant medications in the United States. J Clin Psychiatry. 2013;74(1):43-50.

6. Olfson M, Marcus SC, Zhang HF, Wan GJ. Continuity in methylphenidate treatment of adults with attention-deficit/hyperactivity disorder. J Manag Care Pharm. 2007;13(7):570-577.

7. 21 CFR § 1306.12

8. National Collaborating Centre for Mental Health (UK). Attention deficit hyperactivity disorder: diagnosis and management of ADHD in children, young people and adults. NICE Clinical Guidelines, No. 87. Leicester, United Kingdom: British Psychological Society; 2018.

Author and Disclosure Information

Caroline Richmond is a Clinical Pharmacy Specialist at the Memphis VA Medical Center in Tennessee. Justin Butler is an Academic Detailing Pharmacist at the Lexington Veterans Affairs Health Care System in Kentucky.
Corresponding author: Caroline Richmond ([email protected])

Author disclosures
The authors report no actual or potential conflicts of interest with regard to this article.

Disclaimer
The opinions expressed herein are those of the authors and do not necessarily reflect those of Federal Practitioner, Frontline Medical Communications Inc., the US Government, or any of its agencies. This article may discuss unlabeled or investigational use of certain drugs. Please review the complete prescribing information for specific drugs or drug combinations—including indications, contraindications, warnings, and adverse effects—before administering pharmacologic therapy to patients.

Issue
Federal Practitioner - 37(2)a, pages 86-91


Setting clear expectations for patients and prescribers before and during prescription stimulant use, along with development of a clinical practice protocol, may reduce misuse of stimulant medications.

Dispensing of prescription stimulant medications, such as methylphenidate or amphetamine salts, has been expanding at a rapid rate over the past 2 decades. An astounding 58 million stimulant medications were prescribed in 2014.1,2 Adults now exceed youths in the proportion of prescribed stimulant medications.1,3


 

Reasons for Discontinuation

ADHD was the most prominent indication for use, although the indication was changed to ADHD-like symptoms in several veterans for whom diagnostic testing did not support the ADHD diagnosis. Seventy-eight of 133 veterans prescribed stimulant medications for ADHD received diagnostic testing via a psychologist at LVAHCS. For the 11 veterans who had testing after stimulant initiation, a stimulant-free period was required prior to testing to ensure an accurate diagnosis. For 21% of veterans, the ADHD diagnosis was unsubstantiated by formal testing; however, all of these veterans continued stimulant medication use. For 1 veteran, the psychologist performing the testing documented new diagnoses, including moderate to severe stimulant use disorder and malingering both for PTSD and ADHD. The rate of stimulant prescribing inconsistency, “prescriber-hopping,” and unsupported ADHD diagnosis results warrant a conversation about expectations for transitions of care regarding stimulant medications, not only from outside to inside LVAHCS, but from prescriber to prescriber within the facility.

 

 

In some cases, stimulant medications were discontinued by a prescriber secondary to a worsening of another mental health condition. More than half of the participants in this study had an anxiety disorder diagnosis. Whether or not anxiety predated stimulant use or whether the use of stimulant medications contributed to the diagnosis and thus the addition of an additional CNS depressant to treat anxiety may be an area of research for future consideration. Although bipolar disorder, anxiety disorders, psychosis, and SUD are not contraindications for use of stimulant medications, caution must be used in patients with these diagnoses. Prescribers must weigh risks vs benefits as well as perform close monitoring during use. Similarly, one might look further into stimulant medications prescribed for fatigue and assess the role of any simultaneously prescribed CNS depressants. Is the stimulant being used to treat the adverse effect (AE) of another medication? In 2 documented instances in this study, a psychologist conducted diagnostic testing who reported that the veteran did not meet the criteria for ADHD but that a stimulant may help counteract the iatrogenic effect of anticonvulsants. In both instances stimulant use continued.

Prescription Monitoring

Polysubstance use disorder (5%) was the third most common SUD recorded among study participants. The majority of those with polysubstance use disorder reported abuse/misuse of illicit or prescribed stimulants. Stimulant abuse/misuse was documented in 25 of 200 (12.5%) study participants. In several instances, abuse/misuse was detected by the LVAHCS delivery coordination pharmacist who tracks patterns of early fill requests and prescriptions reported lost/stolen. This pharmacist may request that the prescriber obtain PDMP query, UDS, or pill count if concerning patterns are noted. Lisdexamphetamine is a formulary-restricted medication at LVAHCS, but it was noted to be approved for use when prescribers requested an abuse-deterrent formulation. Investigators noticed a trend in veterans whose prescriptions exceeded the recommended maximum dosage also having stimulant abuse/misuse documented in their medical record. The highest documented total daily dose in this study was 120-mg amphetamine salts IR for ADHD, compared with the normal recommended dosing range of 5 to 40 mg/d for the same indication.

Various modalities were used to monitor participants but less than half of veterans had an annual UDS, quarterly PDMP query, and quarterly prescriber follow-up. PDMP queries and prescriber follow-up was assessed quarterly as would be reasonable given that private sector practitioners may issue multiple prescriptions authorizing the patient to receive up to a 90-day supply.7 Prescriber follow-up ranged from 1 to 15 months. A longer time to follow-up was seen more frequently in stimulant medications prescribed by primary care as compared with that of mental health.

Clinical Practice Protocol

Data from this study were collected with the intent to identify opportunities for improvement in the prescribing and monitoring of stimulant medications. From the above results investigators concluded that this facility may benefit from implementation of a facility-specific clinical practice protocol (CPP) for stimulant prescribing. It may also be beneficial to formulate a chronic stimulant management agreement between patient and prescriber to provide informed consent and clear expectations prior to stimulant medication initiation.

 

 

A CPP could be used to establish stimulant prescribing rules within a facility, which may limit who can prescribe stimulant medications or include a review process and/or required documentation in the medical record when being prescribed outside of specified dosing range and indications for use designated in the CPP or other evidence-based guidelines. Transition of care was found to be an area of opportunity in this study, which could be mitigated with the requirement of a baseline assessment prior to stimulant initiation with the expectation that it be completed regardless of prior prescription stimulant medication use. There was a lack of consistent monitoring for participants in this study, which may be improved if required monitoring parameters and frequency were provided for prescribers. For example, monitoring of heart rate and blood pressure was not assessed in this study, but a CPP may include monitoring vital signs before and after each dose change and every 6 months, per recommendation from the National Institute for Health and Care Excellence ADHD Diagnosis and Management guideline published in 2018.8The CPP may list the responsibilities of all those involved in the prescribing of stimulant medications, such as mental health service leadership, prescribers, nursing staff, pharmacists, social workers, psychologists, and other mental health staff. For prescribers this may include a thorough baseline assessment and criteria for use that must be met prior to stimulant initiation, documentation that must be included in the medical record and required monitoring during stimulant treatment, and expectations for increased monitoring and/or termination of treatment with nonadherence, diversion, or abuse/misuse.

The responsibilities of pharmacists may include establishing criteria for use of nonformulary and restricted agents as well as completion of nonformulary/restricted requests, reviewing dosages that exceed the recommended FDA daily maximum, reviewing uncommon off-label uses of stimulant medications, review and document early fill requests, potential nonadherence, potential drug-seeking behavior, and communication of the following information to the primary prescriber. For other mental health staff this may include documenting any reported AEs of the medication, referring the patient to their prescriber or pharmacist for any medication questions or concerns, and assessment of effectiveness and/or worsening behavior during patient contact.

Limitations

One limitation of this study was the way that data were pulled from patient charts. For example, only 3/200 participants in this study had insomnia per diagnosis codes, whereas that number was substantially higher when chart review was used to assess active prescriptions for sleep aids or documented complaints of insomnia in prescriber progress notes. For this same reason, rates of SUDs must be interpreted with caution as well. SUD diagnosis, both current and in remission were taken into account during data collection. Per diagnosis codes, 36 (18%) veterans in this study had a history of SUD, but this number was higher (31.5%) during chart review. The majority of discrepancies were found when participants reported a history of SUD to the prescriber, but this information was not captured via the problem list or encounter codes. What some may consider a minor omission in documentation can have a large impact on patient care as it is unlikely that prescribers have adequate administrative time to complete a chart review in order to find a complete past medical history as was required of investigators in this study. For this reason, incomplete provider documentation and human error that can occur as a result of a retrospective chart review were also identified as study limitations.

 

 

Conclusion

Our data show that there is still substantial room for improvement in the prescribing and monitoring of stimulant medications. The rate of stimulant prescribing inconsistency, prescriber-hopping, and unsupported ADHD diagnosis resulting from formal diagnostic testing warrant a review in the processes for transition of care regarding stimulant medications, both within and outside of this facility. A lack of consistent monitoring was also identified in this study. One of the most appreciable areas of opportunity resulting from this study is the need for consistency in both the prescribing and monitoring of stimulant medications. From the above results investigators concluded that this facility may benefit from implementation of a CPP for stimulant prescribing as well as a chronic stimulant management agreement to provide clear expectations for patients and prescribers prior to and during prescription stimulant use.

Acknowledgments 

We thank Tori Wilhoit, PharmD candidate, and Dana Fischer, PharmD candidate, for their participation in data collection and Courtney Eatmon, PharmD, BCPP, for her general administrative support throughout this study.

Dispensing of prescription stimulant medications, such as methylphenidate or amphetamine salts, has expanded rapidly over the past 2 decades: 58 million stimulant prescriptions were written in 2014 alone.1,2 Adults now exceed youths in the proportion of stimulant medications prescribed.1,3

Off-label use of prescription stimulant medications, such as for performance enhancement, fatigue management, weight loss, medication-assisted therapy for stimulant use disorders, and adjunctive treatment for certain depressive disorders, is reported to account for ≥ 40% of total stimulant use and is much more common in adults.1 A 2017 study assessing risk of amphetamine use disorder and mortality among veterans prescribed stimulant medications within the Veterans Health Administration (VHA) reported off-label use in nearly 3 of every 5 incident users in 2012.4 Off-label use also is significantly more common when stimulants are prescribed by nonpsychiatric physicians than by psychiatrists.1

One study assessing stimulant prescribing from 2006 to 2009 found that nearly 60% of adults were prescribed stimulant medications by nonpsychiatrist physicians, and only 34% of those adults had a diagnosis of attention-deficit/hyperactivity disorder (ADHD).5 Findings from managed care plans covering 2000 to 2004 were similar: only 30% of adult patients prescribed methylphenidate had at least 1 medical claim with a diagnosis of ADHD.6 Of the approximately 16 million adults prescribed stimulant medications in 2017, > 5 million reported stimulant misuse.3 Much attention has been focused on misuse of stimulant medications by youths and young adults, but newer data suggest that increased monitoring also is needed among the US adult population. Per the US Department of Veterans Affairs (VA) Academic Detailing Stimulant Dashboard, as of October 2018 the national average of veterans with a documented substance use disorder (SUD) who are also prescribed stimulant medications through the VHA exceeds 20%; < 50% have an annual urine drug screen (UDS); and > 10% are coprescribed opioids and benzodiazepines. The percentage of veterans prescribed stimulant medications in the presence of a SUD has increased over the past decade, from a reported 8.7% in 2002 to 14.3% in 2012.4

There are currently no protocols, prescribing restrictions, or required monitoring parameters in place for prescription stimulant use within the Lexington VA Health Care System (LVAHCS). The purpose of this study was to evaluate stimulant medication prescribing practices at LVAHCS and identify opportunities for improvement in the prescribing and monitoring of this drug class.

Methods

This study was a single-center quality improvement project evaluating the prescribing practices of stimulant medications within LVAHCS and was exempt from institutional review board approval. Veterans were included in the study if they were prescribed amphetamine salts, dextroamphetamine, lisdexamphetamine, or methylphenidate between January 1, 2018 and June 30, 2018; however, each veteran's entire stimulant use history was assessed. Veterans were excluded if duration of use was < 2 months or if < 2 prescriptions were filled during the study period. Data for veterans who met the prespecified inclusion and exclusion criteria were collected via chart review and Microsoft SQL Server Management Studio.
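The inclusion and exclusion screen described above can be expressed programmatically. The following is a minimal sketch only: the record layout and field names (`drug`, `fill_dates`, `months_of_use`) are hypothetical, and the study's actual extraction was performed via chart review and Microsoft SQL Server Management Studio rather than this code.

```python
from datetime import date

# Stimulants of interest, per the inclusion criteria above
STIMULANTS = {"amphetamine salts", "dextroamphetamine",
              "lisdexamphetamine", "methylphenidate"}
STUDY_START, STUDY_END = date(2018, 1, 1), date(2018, 6, 30)

def meets_criteria(record):
    """record: dict with hypothetical keys 'drug', 'fill_dates',
    and 'months_of_use'. Returns True if the veteran is included."""
    fills_in_window = [d for d in record["fill_dates"]
                       if STUDY_START <= d <= STUDY_END]
    return (record["drug"] in STIMULANTS
            and len(fills_in_window) >= 2      # exclude < 2 fills in the study period
            and record["months_of_use"] >= 2)  # exclude < 2 months of use
```

Note that the fill count is restricted to the 6-month study window, while duration of use reflects the veteran's entire stimulant history, mirroring the criteria above.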

 

 

Collected data included age, gender, stimulant regimen (drug name, dose, frequency), indication and duration of use, prescriber name and specialty, prescribing origin of the initial stimulant medication, and whether stimulant use predated military service. Monitoring of stimulant medications was assessed as collection of a UDS at least annually, query of the prescription drug monitoring program (PDMP) at least quarterly, and the average time between follow-up appointments with the stimulant prescriber.

Monitoring parameters were assessed from January 1, 2017 through June 30, 2018, as it was felt that the 6-month study period would be too narrow to accurately assess monitoring trends. Mental health diagnoses, ADHD diagnostic testing if applicable, documented SUD or stimulant misuse past or present, and concomitant central nervous system (CNS) depressant use also were collected. CNS depressants evaluated were those that have abuse potential or significant psychotropic effects and included benzodiazepines, antipsychotics, opioids, gabapentin/pregabalin, Z-hypnotics, and muscle relaxants.
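The "at least annually" and "at least quarterly" monitoring checks over the extended assessment window could be operationalized as follows. This is an illustrative sketch, not the study's code: the function names are invented, a quarter is approximated as 92 days, and the event dates are assumed to have already been extracted per participant.

```python
from datetime import date

# Extended assessment window described above
WINDOW_START, WINDOW_END = date(2017, 1, 1), date(2018, 6, 30)

def max_gap_days(event_dates):
    """Longest gap (in days) between consecutive monitoring events,
    counting the gaps from the window start and to the window end."""
    points = sorted([WINDOW_START, *event_dates, WINDOW_END])
    return max((b - a).days for a, b in zip(points, points[1:]))

def met_annual_uds(uds_dates):
    # UDS at least annually: no gap may exceed ~1 year
    return max_gap_days(uds_dates) <= 365

def met_quarterly_pdmp(pdmp_dates):
    # PDMP at least quarterly: no gap may exceed ~1 quarter (92 days assumed)
    return max_gap_days(pdmp_dates) <= 92
```

Including the window edges in the gap calculation ensures a participant with a single early screen and nothing afterward is not counted as monitored.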

Results

The majority of participants were male (168/200), with an average age of 43.3 years. Dextroamphetamine/amphetamine was the most commonly used stimulant (48.5%), followed by methylphenidate (40%) and dextroamphetamine (10%). Lisdexamphetamine was the least used, likely due to its formulary-restricted status within this facility. An extended-release (ER) formulation was used by 1 in 4 participants, and 1 in 20 participants was prescribed a combination of immediate-release (IR) and ER formulations. Duration of use ranged from 3 months to 14 years, with an average duration of 4 years (Table 1).

Nearly 40% of participants reported an origin of stimulant initiation outside of LVAHCS: 14% were started on prescription stimulant medications while active-duty service members, 10.5% had stimulant medications initiated at another VA facility, and 15% reported being prescribed stimulant medications by a civilian prescriber prior to receiving them at LVAHCS. Seventy-four of 79 (93.6%) participants with an origin of stimulant prescription outside of LVAHCS reported a US Food and Drug Administration (FDA)-approved indication for use. The majority (87%) of stimulant medications were prescribed by the mental health service, and 25% of initial stimulant prescriptions were written by a single mental health prescriber. Eleven percent of participants were prescribed stimulant medications by multiple specialties, and nearly all participants had > 1 stimulant prescriber over the course of their treatment. More than 10% of veterans had their stimulant medication discontinued by one prescriber and then restarted by another.

Stimulant medications were used for FDA-approved indications (ADHD and narcolepsy) in 69.5% of participants. Of note, this included patients who maintained an ADHD diagnosis in their medical record even if it was not substantiated with diagnostic testing. Of the participants reporting ADHD as an indication for stimulant use, diagnostic testing was conducted at LVAHCS to confirm the diagnosis in 58.6% (78/133); 20.5% (16/78) of these diagnostic tests did not support a diagnosis of ADHD. All documented indications for use can be found in Table 2.



As expected, the most common indication was ADHD (66.5%), followed by ADHD-like symptoms (9%), refractory depression (7%), and fatigue (5.5%). Fourteen percent of participants had ≥ 1 change in indication for use, with some having up to 4 different documented indications while prescribed stimulant medications. For 12% of participants, stimulant initiation was denied or an active stimulant medication was discontinued by one health care provider, only to be started or restarted by another following a prescriber change. Apart from the indication for stimulant use, 90% of participants had at least 1 additional mental health diagnosis. The rates of all mental health diagnoses documented in the medical record problem list can be found in Table 3.



A UDS was collected at least annually in 37% of participants. A methylphenidate confirmatory screen was ordered to assess adherence in just 2 (2.5%) participants prescribed methylphenidate. The PDMP was queried at least quarterly for 26% of participants while they were actively prescribed stimulant medications. Time to follow-up with the prescriber ranged from 1 to 15 months, and 40% of participants had follow-up at least quarterly. The number of participants with a SUD, either active or in remission, differed when identified via the problem list (36/200) vs prescriber documentation (63/200). The most common SUD was alcohol use disorder (13%), followed by cannabis use disorder (5%), polysubstance use disorder (5%), opioid use disorder (4.5%), stimulant use disorder (2.5%), and sedative use disorder (1%). Twenty-five participants currently prescribed stimulant medications had stimulant abuse/misuse documented in their medical record. Fifty-four percent of participants were prescribed at least 1 CNS depressant considered to have abuse potential or significant psychotropic effects: opioids were most common (23%), followed by muscle relaxants (15.5%), benzodiazepines (15%), antipsychotics (13%), gabapentin/pregabalin (12%), and Z-hypnotics (12%).

 

 

Discussion

The source of the initial stimulant prescription was assessed. The majority of veterans had received medical care prior to receiving care at LVAHCS, whether on active duty, at another VA facility, or from a private civilian prescriber. The origin of the initial stimulant medication and the indication for use were patient reported. Requiring medical records from civilian providers prior to continuing stimulant medication is prescriber dependent, and such records were not available for all participants.

As expected, the majority of participants (87%) received their first stimulant prescription from a mental health prescriber; 20 received stimulant prescriptions from primary care, 4 from the emergency department (ED), and 2 from neurology. Three of the 4 ED prescriptions were written for continuity of care until the veteran could be seen by a mental health or primary care provider; the fourth was written by a mental health nurse practitioner for a veteran who presented to the ED with complaints of ADHD-like symptoms. More than 10% of veterans had their stimulant medication discontinued by one prescriber and then restarted by another.

Reasons for discontinuation included a UDS positive for cocaine, psychosis, a broken narcotic contract, an ADHD diagnosis not supported by psychological testing, chronic bipolar disorder secondary to stimulant use, diversion, stimulant misuse, and lack of an indication for use. There also were a handful of veterans whose VA prescribers declined to initiate prescription stimulant medications for various reasons; these veterans sought care from civilian prescribers who prescribed stimulant medications, then returned to the VA for medication management, where the stimulant medications were continued. Fourteen percent (28/200) of participants had multiple indications for use at some point during stimulant therapy. Eight of these were reasonable changes from ADHD to ADHD-like symptoms when the diagnosis was not substantiated by testing. The cause of other changes in indication was not well documented and often unclear. One veteran had 4 different indications for use documented in the medical record, often changing with each change in prescriber. The most recent prescriber appeared uncertain of the actual indication but did not want to discontinue the medication, documenting that the stimulant should continue for presumed ADHD/mood/fatigue/cognitive dysfunction, which were all of the indications documented by the veteran's previous prescribers.

 

Reasons for Discontinuation

ADHD was the most prominent indication for use, although the indication was changed to ADHD-like symptoms in several veterans for whom diagnostic testing did not support the diagnosis. Seventy-eight of 133 veterans prescribed stimulant medications for ADHD received diagnostic testing from a psychologist at LVAHCS. For the 11 veterans who had testing after stimulant initiation, a stimulant-free period was required prior to testing to ensure an accurate diagnosis. For 21% of veterans, the ADHD diagnosis was unsubstantiated by formal testing; however, all of these veterans continued stimulant medication use. For 1 veteran, the psychologist performing the testing documented new diagnoses, including moderate to severe stimulant use disorder and malingering of both PTSD and ADHD. The rates of stimulant prescribing inconsistency, “prescriber-hopping,” and unsupported ADHD diagnoses warrant a conversation about expectations for transitions of care regarding stimulant medications, not only from outside to inside LVAHCS, but from prescriber to prescriber within the facility.

 

 

In some cases, stimulant medications were discontinued by a prescriber secondary to worsening of another mental health condition. More than half of the participants in this study had an anxiety disorder diagnosis. Whether anxiety predated stimulant use, or whether stimulant medications contributed to the diagnosis and thus to the addition of a CNS depressant to treat anxiety, may be an area for future research. Although bipolar disorder, anxiety disorders, psychosis, and SUD are not contraindications to stimulant medications, caution must be used in patients with these diagnoses: prescribers must weigh risks vs benefits and monitor closely during use. Similarly, one might look further into stimulant medications prescribed for fatigue and assess the role of any simultaneously prescribed CNS depressants. Is the stimulant being used to treat the adverse effect (AE) of another medication? In 2 documented instances in this study, the psychologist who conducted diagnostic testing reported that the veteran did not meet the criteria for ADHD but that a stimulant might help counteract the iatrogenic effect of anticonvulsants. In both instances stimulant use continued.

Prescription Monitoring

Polysubstance use disorder (5%) was the third most common SUD recorded among study participants, and the majority of those with polysubstance use disorder reported abuse/misuse of illicit or prescribed stimulants. Stimulant abuse/misuse was documented in 25 of 200 (12.5%) study participants. In several instances, abuse/misuse was detected by the LVAHCS delivery coordination pharmacist, who tracks patterns of early fill requests and prescriptions reported lost/stolen; this pharmacist may ask the prescriber to obtain a PDMP query, UDS, or pill count if concerning patterns are noted. Lisdexamphetamine is a formulary-restricted medication at LVAHCS but was approved for use when prescribers requested an abuse-deterrent formulation. Investigators noticed that veterans whose prescriptions exceeded the recommended maximum dosage also tended to have stimulant abuse/misuse documented in their medical record. The highest documented total daily dose in this study was 120 mg of IR amphetamine salts for ADHD, compared with the recommended dosing range of 5 to 40 mg/d for that indication.
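A pharmacist-style check for regimens exceeding the recommended maximum could be sketched as below. The function name is illustrative, and only the single limit stated above (40 mg/d for IR amphetamine salts in ADHD) is encoded; a real table would cover each agent, formulation, and indication.

```python
# Illustrative recommended maximum total daily doses (mg) for the ADHD
# indication; the IR amphetamine salts figure is taken from the text above.
MAX_DAILY_MG = {"amphetamine salts IR": 40}

def exceeds_recommended_max(drug, total_daily_mg):
    """Flag a regimen whose total daily dose exceeds the recommended maximum.
    Returns False when no limit is on file for the drug."""
    limit = MAX_DAILY_MG.get(drug)
    return limit is not None and total_daily_mg > limit
```

Under this sketch, the 120 mg/d regimen described above would be flagged for review, while doses at or below 40 mg/d would not.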

Various modalities were used to monitor participants, but fewer than half of veterans had an annual UDS, quarterly PDMP query, and quarterly prescriber follow-up. PDMP queries and prescriber follow-up were assessed quarterly, a reasonable interval given that private sector practitioners may issue multiple prescriptions authorizing the patient to receive up to a 90-day supply.7 Prescriber follow-up ranged from 1 to 15 months; longer times to follow-up were seen more often when stimulant medications were prescribed by primary care than by mental health.

Clinical Practice Protocol

Data from this study were collected with the intent of identifying opportunities for improvement in the prescribing and monitoring of stimulant medications. From these results, investigators concluded that this facility may benefit from implementation of a facility-specific clinical practice protocol (CPP) for stimulant prescribing. It also may be beneficial to formulate a chronic stimulant management agreement between patient and prescriber to provide informed consent and clear expectations prior to stimulant medication initiation.

 

 

A CPP could be used to establish stimulant prescribing rules within a facility; it might limit who can prescribe stimulant medications or require a review process and/or specific documentation in the medical record when stimulants are prescribed outside the dosing ranges and indications designated in the CPP or other evidence-based guidelines. Transition of care was found to be an area of opportunity in this study, which could be mitigated by requiring a baseline assessment prior to stimulant initiation, completed regardless of prior prescription stimulant use. The lack of consistent monitoring for participants in this study might be improved if required monitoring parameters and frequencies were provided to prescribers. For example, monitoring of heart rate and blood pressure was not assessed in this study, but a CPP may include monitoring vital signs before and after each dose change and every 6 months, per the recommendation of the 2018 National Institute for Health and Care Excellence ADHD diagnosis and management guideline.8

The CPP also may list the responsibilities of all those involved in the prescribing of stimulant medications, such as mental health service leadership, prescribers, nursing staff, pharmacists, social workers, psychologists, and other mental health staff. For prescribers, this may include a thorough baseline assessment and criteria for use that must be met prior to stimulant initiation, documentation that must be included in the medical record, required monitoring during stimulant treatment, and expectations for increased monitoring and/or termination of treatment with nonadherence, diversion, or abuse/misuse.

The responsibilities of pharmacists may include establishing criteria for use of nonformulary and restricted agents and completing nonformulary/restricted requests; reviewing dosages that exceed the recommended FDA daily maximum; reviewing uncommon off-label uses of stimulant medications; reviewing and documenting early fill requests, potential nonadherence, and potential drug-seeking behavior; and communicating this information to the primary prescriber. For other mental health staff, responsibilities may include documenting any reported AEs of the medication, referring the patient to their prescriber or pharmacist for any medication questions or concerns, and assessing effectiveness and/or worsening behavior during patient contact.

Limitations

One limitation of this study was the way data were pulled from patient charts. For example, only 3 of 200 participants had insomnia per diagnosis codes, whereas that number was substantially higher when chart review was used to assess active prescriptions for sleep aids or documented complaints of insomnia in prescriber progress notes. For the same reason, rates of SUDs must be interpreted with caution. SUD diagnoses, both current and in remission, were taken into account during data collection. Per diagnosis codes, 36 (18%) veterans in this study had a history of SUD, but this number was higher (31.5%) on chart review. The majority of discrepancies arose when participants reported a history of SUD to the prescriber but this information was not captured in the problem list or encounter codes. What some may consider a minor omission in documentation can have a large impact on patient care, as prescribers are unlikely to have adequate administrative time to complete the kind of chart review required of investigators in this study to assemble a complete past medical history. For this reason, incomplete provider documentation, and the human error possible in a retrospective chart review, also were identified as study limitations.

 

 

Conclusion

Our data show that there is still substantial room for improvement in the prescribing and monitoring of stimulant medications. The rates of inconsistent stimulant prescribing, prescriber-hopping, and ADHD diagnoses unsupported by formal diagnostic testing warrant a review of transition-of-care processes for stimulant medications, both within and outside this facility. This study also identified a lack of consistent monitoring, making consistency in both prescribing and monitoring one of the most appreciable areas of opportunity. From these results, investigators concluded that this facility may benefit from implementation of a CPP for stimulant prescribing as well as a chronic stimulant management agreement that provides clear expectations for patients and prescribers before and during prescription stimulant use.

Acknowledgments 

We thank Tori Wilhoit, PharmD candidate, and Dana Fischer, PharmD candidate, for their participation in data collection and Courtney Eatmon, PharmD, BCPP, for her general administrative support throughout this study.

References

1. Safer DJ. Recent trends in stimulant usage. J Atten Disord. 2016;20(6):471-477.

2. Jones C; US Food and Drug Administration. The opioid epidemic overview and a look to the future. http://www.agencymeddirectors.wa.gov/Files/OpioidConference/2Jones_OPIOIDEPIDEMICOVERVIEW.pdf. Published June 12, 2015. Accessed January 16, 2020.

3. Compton WM, Han B, Blanco C, Johnson K, Jones CM. Prevalence and correlates of prescription stimulant use, misuse, use disorders, motivations for misuse among adults in the United States. Am J Psychiatry. 2018;175(8):741-755.  

4. Westover AN, Nakonezney PA, Halm EA, Adinoff B. Risk of amphetamine use disorder and mortality among incident users of prescribed stimulant medications in the Veterans Administration. Addiction. 2018;113(5):857-867.

5. Olfson M, Blanco C, Wang S, Greenhill LL. Trends in office-based treatment of adults with stimulant medications in the United States. J Clin Psychiatry. 2013;74(1):43-50.

6. Olfson M, Marcus SC, Zhang HF, Wan GJ. Continuity in methylphenidate treatment of adults with attention-deficit/hyperactivity disorder. J Manag Care Pharm. 2007;13(7):570-577.

7. 21 CFR § 1306.12

8. National Collaborating Centre for Mental Health (UK). Attention deficit hyperactivity disorder: diagnosis and management of ADHD in children, young people and adults. NICE Clinical Guidelines, No. 87. Leicester, United Kingdom: British Psychological Society; 2018.


Issue
Federal Practitioner - 37(2)a

Page Number
86-91

February 2020: Question 1


Q1. Correct Answer: B

Rationale

The leading cause of death in patients with NASH is cardiovascular disease. Death from liver-related causes is much more common in NASH than in the general population, but is not the leading cause of death. Cancer-related death is among the top three causes of death in patients with NASH, but is not the most common.

References

1. Adams LA, Lymp JF, St Sauver J, et al. The natural history of nonalcoholic fatty liver disease: a population-based cohort study. Gastroenterology 2005;129:113-21.

2. Chalasani N, Younossi Z, Lavine JE, et al. The Diagnosis and Management of Nonalcoholic Fatty Liver Disease: Practice Guidance from the American Association for the Study of Liver Diseases. Hepatology 2018;67:328-57.

Questionnaire Body

You recently diagnosed a 66-year-old man with cirrhosis due to nonalcoholic steatohepatitis. The patient presents to your clinic now inquiring about his long-term prognosis.


Introduction to population management


Defining the key terms

Traditionally, U.S. health care has operated under a fee-for-service payment model, in which health care providers (such as physicians, hospitals, and health care systems) receive a fee for services such as office visits, hospital stays, procedures, and tests. However, reimbursement discussions are increasingly moving from fee-for-service to value-based, in which payments are tied to managing population health and total cost of care.


Because these changes will impact the entire system all the way down to individual providers, in the upcoming Population Management article series in The Hospitalist, we will discuss the nuances and implications that physicians, executives, and hospitals should be aware of. In this first article, we will examine the impetus for the shift toward population management and introduce common terminology to lay the foundation for the future content.
 

The traditional model: Fee for service

Under the traditional fee-for-service payment system, health care providers are paid per unit of service. For example, hospitals receive diagnosis-related group (DRG) payments for inpatient stays, and physicians are paid per patient visit. The more services that hospitals or physicians provide, the more money both get paid, without financial consequences for quality outcomes or total cost of care. Total cost of care includes clinic visits, outpatient procedures and tests, hospital and ED visits, home health, skilled nursing facilities, durable medical equipment, and sometimes drugs during an episode of care (for example, a hospital stay plus 90 days after discharge) or over a period of time (for example, a month or a year).

As a result of the fee-for-service payment system, the United States spends more money on health care than other wealthy countries, yet it lags behind other countries on many quality measures, such as disease burden, overall mortality, premature death, and preventable death.1,2

In 2007, the Institute for Healthcare Improvement (IHI) developed the Triple Aim framework that focused on the following:

  • Improving the patient experience of care (including quality and satisfaction).
  • Improving the health of populations.
  • Reducing per capita cost of care.

Public payers such as Medicare and Medicaid, as well as private payers, embraced the Triple Aim to reform how health care is delivered and paid for. As a result, health care delivery focus and financial incentives are shifting from managing discrete patient encounters for acute illness to managing population health and total cost of care.
 

A new approach: Population management

Before diving into population management, it is important to first understand the terms “population” and “population health.” A population can be defined geographically or may include employees of an organization, members of a health plan, or patients receiving care from a specific physician group or health care system. David A. Kindig, MD, PhD, professor emeritus of population health sciences at the University of Wisconsin–Madison, defined population health as “the health outcomes of a group of individuals, including the distribution of such outcomes within the group.”3 Dr. Kindig noted that population health outcomes have many determinants, such as the following:4

 

 

  • Health care (access, cost, quantity, and quality of health care services).
  • Individual behavior (including diet, exercise, and substance abuse).
  • Genetics.
  • The social environment (education, income, occupation, class, and social support).
  • Physical environment (air and water quality, lead exposure, and the design of neighborhoods).

IHI operationally defines population health by measures such as life expectancy, mortality rates, health and functional status, the incidence and/or prevalence of chronic disease, and behavioral and physiological factors such as smoking, physical activity, diet, blood pressure, body mass index, and cholesterol.5

On the other hand, population management is primarily concerned with health care determinants of health and, according to IHI, should be clearly distinguished from population health, which focuses on the broader determinants of health.5


According to Ron Greeno, MD, MHM, one of the founding members and a past-president of the Society of Hospital Medicine, population management is a “global approach of caring for an entire patient population to deliver safe and equitable care and to more intelligently allocate resources to keep people well.”

Population management requires understanding the patient population, which includes risk stratification and redesigning and delivering services that are guided by integrated clinical and administrative data and enabled by information technology.
 

Cost-sharing payment models

The cornerstone of population management is provider accountability for the cost of care, which can be accomplished through shared-risk models or population-based payments. Let’s take a closer look at each.

Under shared-risk models, providers receive payment based on their performance against cost targets. The goal is to generate cost savings by improving care coordination, engaging patients in shared decision making based on their health goals, and reducing utilization of care that provides little to no value for patients (for example, preventable hospital admissions or unnecessary imaging or procedures).

Cost targets and actual spending are reconciled retrospectively. If providers beat cost targets, they are eligible to keep a share of generated savings based on their performance on selected quality measures. However, if providers’ actual spending exceeds cost targets, they will compensate payers for a portion of the losses. Under one-sided risk models, providers are eligible for shared savings but not financially responsible for losses. Under two-sided risk models, providers are accountable for both savings and losses.
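That retrospective reconciliation can be made concrete with a short sketch. The dollar figures, 50% sharing rates, and quality multiplier below are invented for illustration; real contracts negotiate these terms.

```python
# Illustrative retrospective reconciliation under a shared-risk contract.
# All parameters are hypothetical, not drawn from any actual agreement.

def reconcile(actual_spend, cost_target, quality_multiplier=1.0,
              savings_share=0.5, loss_share=0.5, two_sided=True):
    """Settlement owed to the provider: positive = payment received,
    negative = amount repaid to the payer."""
    delta = cost_target - actual_spend
    if delta >= 0:
        # Beat the target: keep a share of savings, scaled by
        # performance on selected quality measures.
        return delta * savings_share * quality_multiplier
    if two_sided:
        return delta * loss_share   # repay a share of the overspend
    return 0.0                      # one-sided: shared savings only, no downside

# A group with a $10M target that spends $9.2M and meets quality goals:
print(reconcile(9_200_000, 10_000_000))
# The same $600K overspend under one-sided vs. two-sided risk:
print(reconcile(10_600_000, 10_000_000, two_sided=False))
print(reconcile(10_600_000, 10_000_000, two_sided=True))
```

The last two calls show the defining difference between the models: under one-sided risk the overspend costs the provider nothing, while under two-sided risk a share of it is owed back.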

With prospective population-based payments, also known as capitation, providers receive in advance a fixed amount of money per patient per unit of time (for example, per month) that creates a budget to cover the cost of agreed-upon health care services. The prospective payments are risk adjusted and typically tied to performance on selected quality, effectiveness, and patient experience measures.

Professional services capitation arrangements between physician groups and payers cover the cost of physician services including primary care, specialty care, and related laboratory and radiology services. Under global capitation or global payment arrangements, health care systems receive payments that cover the total cost of care for the patient population for a defined period.

Population-based payments create incentives to provide high-quality and efficient care within a set budget.6 If the actual cost of delivering services to the defined patient population comes in under budget, providers realize savings; otherwise, they incur losses.
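A prospective capitated budget can be sketched the same way. The per-member-per-month (PMPM) rate, panel size, risk scores, and spending figure below are all hypothetical.

```python
# Hypothetical sketch of a prospective population-based (capitated) budget.

def capitated_budget(risk_scores, pmpm_rate, months=12):
    """Budget = sum of risk-adjusted per-member-per-month payments,
    where a risk score of 1.0 represents average expected cost."""
    return sum(score * pmpm_rate * months for score in risk_scores)

panel = [1.0, 0.8, 2.5, 1.2]     # risk-adjusted scores for a four-member panel
budget = capitated_budget(panel, pmpm_rate=450)

actual_cost = 28_000             # total spend actually incurred over the year
margin = budget - actual_cost    # positive = savings kept, negative = loss
print(budget, margin)
```

Because the payment is fixed in advance, the provider's margin depends entirely on keeping actual cost under the risk-adjusted budget, which is the incentive the paragraph above describes.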
 

What is next?

Now that we have explained the impetus for population management and the terminology, in the next article in this series we will discuss the current state of population management. We will also delve into a hospitalist’s role and participation so you can be aware of impending changes and ensure you are set up for success, no matter how the payment models evolve.
 

Dr. Farah is a hospitalist, physician adviser, and Lean Six Sigma Black Belt. She is a performance improvement consultant based in Corvallis, Ore., and a member of The Hospitalist’s editorial advisory board.

References

1. Source: https://www.healthsystemtracker.org/chart-collection/health-spending-u-s-compare-countries/#item-start

2. Source: https://www.healthsystemtracker.org/brief/on-several-indicators-of-healthcare-quality-the-u-s-falls-short/

3. Kindig D, Asada Y, Booske B. A population health framework for setting national and state health goals. JAMA. 2008;299:2081-2083.

4. Source: https://improvingpopulationhealth.typepad.com/blog/what-are-health-factorsdeterminants.html

5. Source: http://www.ihi.org/communities/blogs/population-health-population-management-terminology-in-us-health-care

6. Source: http://hcp-lan.org/workproducts/apm-refresh-whitepaper-final.pdf

Publications
Topics
Sections

Defining the key terms

Defining the key terms

Traditionally, U.S. health care has operated under a fee-for-service payment model, in which health care providers (such as physicians, hospitals, and health care systems) receive a fee for services such as office visits, hospital stays, procedures, and tests. However, reimbursement discussions are increasingly moving from fee-for-service to value-based, in which payments are tied to managing population health and total cost of care.

Dr. Marina Farah

Because these changes will impact the entire system all the way down to individual providers, in the upcoming Population Management article series in The Hospitalist, we will discuss the nuances and implications that physicians, executives, and hospitals should be aware of. In this first article, we will examine the impetus for the shift toward population management and introduce common terminology to lay the foundation for the future content.
 

The traditional model: Fee for service

Under the traditional fee-for-service payment system, health care providers are paid per unit of service. For example, hospitals receive diagnosis-related group (DRG) payments for inpatient stays, and physicians are paid per patient visit. The more services that hospitals or physicians provide, the more money both get paid, without financial consequences for quality outcomes or total cost of care. Total cost of care includes clinic visits, outpatient procedures and tests, hospital and ED visits, home health, skilled nursing facilities, durable medical equipment, and sometimes drugs during an episode of care (for example, a hospital stay plus 90 days after discharge) or over a period of time (for example, a month or a year).

As a result of the fee-for-service payment system, the United States spends more money on health care than other wealthy countries, yet it lags behind other countries on many quality measures, such as disease burden, overall mortality, premature death, and preventable death.1,2

In 2007, the Institute for Healthcare Improvement (IHI) developed the Triple Aim framework that focused on the following:

  • Improving the patient experience of care (including quality and satisfaction).
  • Improving the health of populations.
  • Reducing per capita cost of care.

Both public payers like Medicare and Medicaid, as well as private payers, embraced the Triple Aim to reform how health care is delivered and paid for. As such, health care delivery focus and financial incentives are shifting from managing discrete patient encounters for acute illness to managing population health and total cost of care.
 

A new approach: Population management

Before diving into population management, it is important to first understand the terms “population” and “population health.” A population can be defined geographically or may include employees of an organization, members of a health plan, or patients receiving care from a specific physician group or health care system. David A. Kindig, MD, PhD, professor emeritus of population health sciences at the University of Wisconsin–Madison, defined population health as “the health outcomes of a group of individuals, including the distribution of such outcomes within the group.”3 Dr. Kindig noted that population health outcomes have many determinants, such as the following:4

 

 

  • Health care (access, cost, quantity, and quality of health care services).
  • Individual behavior (including diet, exercise, and substance abuse).
  • Genetics.
  • The social environment (education, income, occupation, class, and social support).
  • Physical environment (air and water quality, lead exposure, and the design of neighborhoods).

IHI operationally defines population health by measures such as life expectancy, mortality rates, health and functional status, the incidence and/or prevalence of chronic disease, and behavioral and physiological factors such as smoking, physical activity, diet, blood pressure, body mass index, and cholesterol.5

On the other hand, population management is primarily concerned with health care determinants of health and, according to IHI, should be clearly distinguished from population health, which focuses on the broader determinants of health.5

Dr. Ron Greeno

According to Ron Greeno, MD, MHM, one of the founding members and a past-president of the Society of Hospital Medicine, population management is a “global approach of caring for an entire patient population to deliver safe and equitable care and to more intelligently allocate resources to keep people well.”

Population management requires understanding the patient population, which includes risk stratification and redesigning and delivering services that are guided by integrated clinical and administrative data and enabled by information technology.
 

Cost-sharing payment models

The cornerstone of population management is provider accountability for the cost of care, which can be accomplished through shared-risk models or population-based payments. Let’s take a closer look at each.

Under shared-risk models, providers receive payment based on their performance against cost targets. The goal is to generate cost savings by improving care coordination, engaging patients in shared decision making based on their health goals, and reducing utilization of care that provides little to no value for patients (for example, preventable hospital admissions or unnecessary imaging or procedures).

Cost targets and actual spending are reconciled retrospectively. If providers beat cost targets, they are eligible to keep a share of generated savings based on their performance on selected quality measures. However, if providers’ actual spending exceeds cost targets, they will compensate payers for a portion of the losses. Under one-sided risk models, providers are eligible for shared savings but not financially responsible for losses. Under two-sided risk models, providers are accountable for both savings and losses.

With prospective population-based payments, also known as capitation, providers receive in advance a fixed amount of money per patient per unit of time (for example, per month) that creates a budget to cover the cost of agreed-upon health care services. The prospective payments are risk adjusted and typically tied to performance on selected quality, effectiveness, and patient experience measures.

Professional services capitation arrangements between physician groups and payers cover the cost of physician services including primary care, specialty care, and related laboratory and radiology services. Under global capitation or global payment arrangements, health care systems receive payments that cover the total cost of care for the patient population for a defined period.

Population-based payments create incentives to provide high-quality and efficient care within a set budget.6 If actual cost of delivering services to the defined patient population comes under the budget, the providers will realize savings, but otherwise will encounter losses.
 

What is next?

Now that we have explained the impetus for population management and the terminology, in the next article in this series we will discuss the current state of population management. We will also delve into a hospitalist’s role and participation so you can be aware of impending changes and ensure you are set up for success, no matter how the payment models evolve.
 

Dr. Farah is a hospitalist, physician adviser, and Lean Six Sigma Black Belt. She is a performance improvement consultant based in Corvallis, Ore., and a member of The Hospitalist’s editorial advisory board.

References

1. Source: https://www.healthsystemtracker.org/chart-collection/health-spending-u-s-compare-countries/#item-start

2. Source: https://www.healthsystemtracker.org/brief/on-several-indicators-of-healthcare-quality the-u-s-falls-short/

3. Kindig D, Asada Y, Booske B. (2008). A Population Health Framework for Setting National and State Health Goals. JAMA, 299, 2081-2083.

4. Source: https://improvingpopulationhealth.typepad.com/blog/what-are-health-factorsdeterminants.html

5. Source: http://www.ihi.org/communities/blogs/population-health-population-management-terminology-in-us-health-care

6. Source: http://hcp-lan.org/workproducts/apm-refresh-whitepaper-final.pdf

Traditionally, U.S. health care has operated under a fee-for-service payment model, in which health care providers (such as physicians, hospitals, and health care systems) receive a fee for services such as office visits, hospital stays, procedures, and tests. However, reimbursement discussions are increasingly moving from fee-for-service to value-based, in which payments are tied to managing population health and total cost of care.

Dr. Marina Farah

Because these changes will impact the entire system all the way down to individual providers, in the upcoming Population Management article series in The Hospitalist, we will discuss the nuances and implications that physicians, executives, and hospitals should be aware of. In this first article, we will examine the impetus for the shift toward population management and introduce common terminology to lay the foundation for the future content.
 

The traditional model: Fee for service

Under the traditional fee-for-service payment system, health care providers are paid per unit of service. For example, hospitals receive diagnosis-related group (DRG) payments for inpatient stays, and physicians are paid per patient visit. The more services that hospitals or physicians provide, the more money both get paid, without financial consequences for quality outcomes or total cost of care. Total cost of care includes clinic visits, outpatient procedures and tests, hospital and ED visits, home health, skilled nursing facilities, durable medical equipment, and sometimes drugs during an episode of care (for example, a hospital stay plus 90 days after discharge) or over a period of time (for example, a month or a year).

As a result of the fee-for-service payment system, the United States spends more money on health care than other wealthy countries, yet it lags behind other countries on many quality measures, such as disease burden, overall mortality, premature death, and preventable death.1,2

In 2007, the Institute for Healthcare Improvement (IHI) developed the Triple Aim framework that focused on the following:

  • Improving the patient experience of care (including quality and satisfaction).
  • Improving the health of populations.
  • Reducing per capita cost of care.

Both public payers like Medicare and Medicaid, as well as private payers, embraced the Triple Aim to reform how health care is delivered and paid for. As such, health care delivery focus and financial incentives are shifting from managing discrete patient encounters for acute illness to managing population health and total cost of care.
 

A new approach: Population management

Before diving into population management, it is important to first understand the terms “population” and “population health.” A population can be defined geographically or may include employees of an organization, members of a health plan, or patients receiving care from a specific physician group or health care system. David A. Kindig, MD, PhD, professor emeritus of population health sciences at the University of Wisconsin–Madison, defined population health as “the health outcomes of a group of individuals, including the distribution of such outcomes within the group.”3 Dr. Kindig noted that population health outcomes have many determinants, such as the following:4

 

 

  • Health care (access, cost, quantity, and quality of health care services).
  • Individual behavior (including diet, exercise, and substance abuse).
  • Genetics.
  • The social environment (education, income, occupation, class, and social support).
  • Physical environment (air and water quality, lead exposure, and the design of neighborhoods).

IHI operationally defines population health by measures such as life expectancy, mortality rates, health and functional status, the incidence and/or prevalence of chronic disease, and behavioral and physiological factors such as smoking, physical activity, diet, blood pressure, body mass index, and cholesterol.5

On the other hand, population management is primarily concerned with health care determinants of health and, according to IHI, should be clearly distinguished from population health, which focuses on the broader determinants of health.5

Dr. Ron Greeno

According to Ron Greeno, MD, MHM, one of the founding members and a past-president of the Society of Hospital Medicine, population management is a “global approach of caring for an entire patient population to deliver safe and equitable care and to more intelligently allocate resources to keep people well.”

Population management requires understanding the patient population, which includes risk stratification and redesigning and delivering services that are guided by integrated clinical and administrative data and enabled by information technology.
 

Cost-sharing payment models

The cornerstone of population management is provider accountability for the cost of care, which can be accomplished through shared-risk models or population-based payments. Let’s take a closer look at each.

Under shared-risk models, providers receive payment based on their performance against cost targets. The goal is to generate cost savings by improving care coordination, engaging patients in shared decision making based on their health goals, and reducing utilization of care that provides little to no value for patients (for example, preventable hospital admissions or unnecessary imaging or procedures).

Cost targets and actual spending are reconciled retrospectively. If providers beat cost targets, they are eligible to keep a share of generated savings based on their performance on selected quality measures. However, if providers’ actual spending exceeds cost targets, they will compensate payers for a portion of the losses. Under one-sided risk models, providers are eligible for shared savings but not financially responsible for losses. Under two-sided risk models, providers are accountable for both savings and losses.

With prospective population-based payments, also known as capitation, providers receive in advance a fixed amount of money per patient per unit of time (for example, per month) that creates a budget to cover the cost of agreed-upon health care services. The prospective payments are risk adjusted and typically tied to performance on selected quality, effectiveness, and patient experience measures.

Professional services capitation arrangements between physician groups and payers cover the cost of physician services including primary care, specialty care, and related laboratory and radiology services. Under global capitation or global payment arrangements, health care systems receive payments that cover the total cost of care for the patient population for a defined period.

Population-based payments create incentives to provide high-quality and efficient care within a set budget.6 If actual cost of delivering services to the defined patient population comes under the budget, the providers will realize savings, but otherwise will encounter losses.
 

What is next?

Now that we have explained the impetus for population management and the terminology, in the next article in this series we will discuss the current state of population management. We will also delve into a hospitalist’s role and participation so you can be aware of impending changes and ensure you are set up for success, no matter how the payment models evolve.
 

Dr. Farah is a hospitalist, physician adviser, and Lean Six Sigma Black Belt. She is a performance improvement consultant based in Corvallis, Ore., and a member of The Hospitalist’s editorial advisory board.

References

1. Source: https://www.healthsystemtracker.org/chart-collection/health-spending-u-s-compare-countries/#item-start

2. Source: https://www.healthsystemtracker.org/brief/on-several-indicators-of-healthcare-quality the-u-s-falls-short/

3. Kindig D, Asada Y, Booske B. (2008). A Population Health Framework for Setting National and State Health Goals. JAMA, 299, 2081-2083.

4. Source: https://improvingpopulationhealth.typepad.com/blog/what-are-health-factorsdeterminants.html

5. Source: http://www.ihi.org/communities/blogs/population-health-population-management-terminology-in-us-health-care

6. Source: http://hcp-lan.org/workproducts/apm-refresh-whitepaper-final.pdf


Be ready for patient questions on sunscreen safety, SPF choice


Dermatologists should be well versed in addressing common concerns that patients, family members, and the media have about photoprotection, Adam Friedman, MD, advised at the ODAC Dermatology, Aesthetic, & Surgical Conference.

Dr. Adam Friedman

“Know the controversies. Be armed and ready when these patients come to your office with questions,” Dr. Friedman, professor and interim chair of dermatology at George Washington University, Washington, said in an interview at the meeting, where he presented on issues related to photoprotection.

Which SPF to choose and the impact of sunscreen on vitamin D are among the issues patients may be asking about.

Sunscreen SPFs above 50 don’t technically provide a “meaningful” increase in ultraviolet protection, given that an SPF of 50 already filters about 98% of UVB, but they still can provide some benefit, which has to do with real-world human error, Dr. Friedman said.

“Most people don’t use sunscreens the right way,” meaning they don’t apply enough to achieve the SPF listed, he added in the interview. “A higher SPF is meaningful, because if they apply less [sunscreen], they actually still are in that safety window,” with the higher SPF sunscreen. (The American Academy of Dermatology recommends an SPF of 30 or higher.) Several studies have shown that an SPF of 70 or 100 is superior to SPF 50, likely because of this “dilutional” effect.
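The SPF figures quoted above follow from simple arithmetic: an SPF of N transmits roughly 1/N of erythemal UVB, so the fraction blocked is 1 − 1/N. A quick illustrative sketch:

```python
# An SPF of N transmits roughly 1/N of erythemal UVB when applied at the
# labeled density, so the fraction blocked is 1 - 1/N. This is why SPF 50
# corresponds to blocking about 98% of UVB, and why the absolute gain
# above SPF 50 is small in ideal conditions.
def fraction_uvb_blocked(spf: float) -> float:
    return 1.0 - 1.0 / spf

for spf in (30, 50, 70, 100):
    print(f"SPF {spf}: blocks ~{fraction_uvb_blocked(spf):.1%} of UVB")
```

The real-world advantage of a higher label, as Dr. Friedman notes, comes from under-application: when less product is applied than the labeled SPF assumes, the effective protection drops, and starting from a higher labeled SPF leaves more margin.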

Patients may have concerns about the effects of sunscreen on vitamin D production, the environment, and hair loss, and whether they have endocrine disrupting effects, added Dr. Friedman, who is also the medical director of the meeting.

Inhibition of cutaneous vitamin D synthesis after sunscreen use varies with how thoroughly the sunscreen is applied, as well as the season, the latitude, and the individual’s age and degree of obesity. Patients with low vitamin D levels can use a vitamin D supplement to achieve sufficient levels, and patients concerned about the impact of sunscreen on vitamin D can be advised to take 600 IU of vitamin D3 a day, according to Dr. Friedman. Some studies have suggested that UVB exposure and the risk of certain cancers are inversely correlated, implicating cutaneous vitamin D synthesis (J Clin Transl Endocrinol. 2014 Oct 5;1[4]:179-86). But correlation does not equal causation, he pointed out.



Other concerns stem from the potential for oxybenzone, a UVA/UVB filter in more than 70% of sunscreens, to act as an endocrine disruptor in people and whether it is potentially damaging the environment. The data driving these concerns “stem from the bench, not the real world,” Dr. Friedman said. While topical application of oxybenzone can result in systemic absorption, and even though it’s been detected in waters that are heavily populated or where people go on vacation, there is no evidence demonstrating toxicity to humans or the coral reefs. “At least the information we have to date says they don’t,” he added.

In a randomized clinical trial recently published in JAMA, Food and Drug Administration investigators found systemic absorption of the six sunscreen active ingredients tested, including oxybenzone, with geometric mean plasma concentrations greater than 0.5 ng/mL (JAMA. 2020;323[3]:256-7). The study was part of an FDA proposed rule requesting additional information on sunscreen ingredients; the plasma concentrations exceeded the level at which further safety studies could potentially be waived.

The study, Dr. Friedman said, “only demonstrated the ability to detect these UV filters at very small concentrations in the blood. They have yet to show any meaningful biologic correlation to these findings.”

For those patients who prefer not to use chemical filters, Dr. Friedman suggests recommending mineral-based sunscreens, of which he said micro- and nanoparticulate formulations offer the best cosmesis by sitting more evenly on the skin, being more amenable to thinner and less-lipophilic vehicles, and limiting visible light scattering (thereby limiting the unsightly white appearance) – while maintaining UV scattering efficacy. However, controversy has emerged as there are past studies that posit the theoretical danger of nanoparticles in sunscreens, given their potential to penetrate the skin and enter cells.

But continually emerging evidence has shown that commercially available nanosunscreens are safe, with no toxicity even at the cellular level when applied to the skin in sunscreen or in cosmetics. “All evidence to date suggests they do not do this,” Dr. Friedman said, noting that, in Europe, the European Commission’s Scientific Committee on Consumer Safety has stated that nanoparticles at concentrations below 25% in sunscreens are safe, “just don’t put them in aerosolized forms.”

Lastly, while some recent studies have detected titanium dioxide on the hair shafts of patients with and without frontal fibrosing alopecia, Dr. Friedman noted more evidence is needed before recommending that these patients avoid using sunscreen (Br J Dermatol. 2019 Jul;181[1]:216-7). “Correlation does not mean causation, and the current dogma is that there’s no connection between these two,” he commented.

Dr. Friedman reported consulting and advisory board relationships with numerous companies; he also reported speaking for Regeneron, AbbVie, and Dermira, and receiving grants from Pfizer and DF Pharma.


Article Source

EXPERT ANALYSIS FROM ODAC 2020


HHS: Coronavirus risk low in U.S., vaccine development underway


U.S. public health officials sought to allay concerns about the coronavirus during a press conference on Tuesday, emphasizing that most Americans are not in danger of contracting the illness and urging citizens not to take extreme measures in response to the low-risk virus.

“Right now, there is no spread of this virus in our communities here at home,” Centers for Disease Control and Prevention director Robert Redfield, MD, said during the Jan. 28 press conference. “This is why our current assessment is that the immediate health risk of this new virus to the general public is low in our nation. The coming days and weeks are likely to bring more confirmed cases here and around the world, including the possibility of some person-to-person spreading, but our goal of the ongoing U.S. public health response is to contain this outbreak and prevent sustained spread of the virus in our country.”

During the press conference, Department of Health & Human Services Secretary Alex M. Azar II reiterated that there have been only five confirmed U.S. cases of the coronavirus thus far and that all were associated with travel to Wuhan, China, where the virus first appeared. The number of confirmed cases in China, meanwhile, has risen to more than 4,500, with about 100 associated deaths.

U.S. health providers should be on the lookout for any patient who has traveled to China recently, particularly to Hubei province, and they should pay close attention to any relevant symptoms, Secretary Azar said during the press conference.

He defended the decision not to declare a public health emergency at this time, stressing that such a move is based on standards and requirements not yet met by the coronavirus.

“It’s important to remember where we are right now; we have five cases in the United States, each of those individuals with direct contact to Wuhan and no person-to-person transmission in the United States,” Secretary Azar said. “I won’t hesitate at all to invoke any authorities that I need to ensure that we’re taking all the steps to protect the American people, but I’ll do it when it’s appropriate under the standards that we have and the authorities that I need.”

In the meantime, a number of efforts are underway by U.S. agencies to assess the nation’s emergency preparedness stockpile, to assist American families in China with evacuation, and to pursue research into diagnostics and a potential vaccine for the virus, Secretary Azar said.

HHS.gov
HHS Secretary Alex Azar (left), NIAID Director Dr. Anthony Fauci, CDC Director Dr. Robert Redfield, and NCIRD Director Dr. Nancy Messonnier.


With regard to countermeasures, the CDC has rapidly developed a diagnostic based on the published sequence of the virus, said Anthony Fauci, MD, director of the National Institute of Allergy and Infectious Diseases (NIAID). The National Institutes of Health and the CDC are now working on the development of next-generation diagnostics to better identify the virus in the United States and throughout the world, Dr. Fauci said during the press conference.

Currently, there are no proven therapeutics for the coronavirus infection, Dr. Fauci said. Based on experiences with SARS and MERS, however, researchers are studying certain antiviral drugs that could potentially treat the virus, he said. This includes the antiviral drug remdesivir, which was developed for the treatment of the Ebola virus, and lopinavir/ritonavir (Kaletra), a combination therapy commonly used to treat HIV. In addition, monoclonal antibodies developed during the SARS outbreak are also being studied.

“Given the somewhat close homology between SARS and the new novel coronavirus, there could be some cross reactivity there that could be utilized,” he said.

Most importantly, he said, vaccine development is underway. Since China isolated the virus and published its sequence, U.S. researchers have already analyzed the components and determined an immunogen to be used in a vaccine, Dr. Fauci said. He anticipates moving to a Phase 1 trial within the next 3 months. The trial would then move to Phase 2 after a few more months of safety data collection.

“What we do from that point will be determined by what has happened with the outbreak over those months,” he said. “We are proceeding as if we will have to deploy a vaccine. In other words, we’re looking at the worst scenario that this becomes a bigger outbreak.”

Federal health officials, however, stressed that more data about infected patients in China is needed for research. HHS has repeatedly offered to send a CDC team to China to help with public health efforts, research, and response, but China has so far declined the offer, Secretary Azar added.

In addition, the CDC has updated its travel advisory in response to the illness. The latest travel guidance recommends that travelers avoid all nonessential travel to all parts of China.

Improving health care with simulation


QI is for clinicians too

Simulation is commonly used in the education and training of health care professionals, but more recently it’s entering the quality improvement world.

“Instead of just thinking about training individuals and teams, people are starting to use simulation to look at the physical layout of resuscitation bays, to map work flows of a patient journey through hospitals, to identify latent safety threats,” said Victoria Brazil, MD, MBA, lead author of a study on the subject in BMJ Quality & Safety. “These are great things to do, but many of the people doing it didn’t have quality improvement skills or knowledge – that’s why we wrote this article.”

Dr. Brazil, a specialist in health care simulation at Gold Coast (Australia) Hospital and Health Service, explained that, “in terms of the top takeaways, for quality improvement teams – and I’m including everyday clinicians in this: Think about simulation as one of the tools that can be utilized when looking at the questions of how we make our performance better, whether that’s a team performance, environmental, investigational impacts, or one of my key interests, whether that’s about exploring and shaping culture in hospitals, which we’ve done a lot of work on using simulation.”

Quality improvement has become a very specialized field, she added, so hospitalists may think it’s outside their purview. “As clinicians, we don’t think about ourselves as being engaged in quality improvement. I think that’s a shame, because many of the things that we can do bit by bit to make our patient outcomes better, we need to be thinking about finding better ways to do those things. I suggest simulation is one way, and that doesn’t need to be a massive simulation center. It can be simulating the kind of things that are important to you, your teams, and your patients and using those to both explore improved performance.”

Dr. Brazil said that Gold Coast Hospital has used simulation as a way of getting people from different departments and different professions together to shape culture through understanding shared knowledge and goals around patient journeys.

“That’s been pretty successful for us, and I think it’s really important that quality improvement has that understanding of context and culture as well as the idea of having specific interventions – maybe like a simulation – to try and improve an outcome,” she said.

Reference

1. Brazil V et al. Connecting simulation and quality improvement: How can healthcare simulation really improve patient care? BMJ Qual Saf. 2019 Jul 18. doi: 10.1136/bmjqs-2019-009767.


Opioid deaths boost donor heart supply


The tragic opioid epidemic has “one small bright spot”: an expanding pool of eligible donor hearts for transplantation, Akshay S. Desai, MD, said at the annual Cardiovascular Conference at Snowmass sponsored by the American College of Cardiology.

Bruce Jancin/MDedge News
Dr. Akshay S. Desai

For decades, the annual volume of heart transplantations performed in the U.S. was static because of the huge mismatch between donor organ supply and demand. But heart transplant volume has increased steadily in the last few years – a result of the opioid epidemic.

Data from the U.S. Organ Procurement and Transplantation Network show that the proportion of donor hearts obtained from individuals who died from drug intoxication climbed from a mere 1.5% in 1999 to 17.6% in 2017, the most recent year for which data are available. Meanwhile, the size of the heart transplant waiting list, which rose year after year in 2009-2015, has since declined (N Engl J Med. 2019 Feb 7;380[6]:597-9).

“What’s amazing is that, even though these patients might have historically been considered high risk in general, the organs recovered from these patients – and particularly the hearts – don’t seem to be any worse in terms of allograft survival than the organs recovered from patients who died from other causes, which are the traditional sources, like blunt head trauma, gunshot wounds, or stroke, that lead to brain death. In general, these organs are useful and do quite well,” according to Dr. Desai, medical director of the cardiomyopathy and heart failure program at Brigham and Women’s Hospital, Boston.

He highlighted several other recent developments in the field of cardiac transplantation that promise to further expand the donor heart pool, including acceptance of hepatitis C–infected donors and organ donation after circulatory rather than brain death. Dr. Desai also drew attention to the unintended perverse consequences of a recent redesign of the U.S. donor heart allocation system and discussed the impressive improvement in clinical outcomes with mechanical circulatory support. He noted that, while relatively few cardiologists practice in the highly specialized centers where heart transplants take place, virtually all cardiologists are affected by advances in heart transplantation since hundreds of thousands of the estimated 7 million Americans with heart failure have advanced disease.

Heart transplantation, he emphasized, is becoming increasingly complex. Recipients are on average older and sicker, with more comorbidities, than in times past. As a result, there is greater need for dual organ transplants: heart/lung, heart/liver, or heart/kidney. Plus, more patients come to transplantation after prior cardiac surgery for implantation of a ventricular assist device, so sensitization to blood products is a growing issue. And, of course, the pool of transplant candidates has expanded.

“We’re now forced to take patients previously considered to have contraindications to transplant; for example, diabetes was a contraindication to transplant in the early years, but now it’s the rule in 35%-40% of our patients who present with advanced heart failure,” the cardiologist noted.

Transplants from HCV-infected donors to uninfected recipients

Hearts and lungs from donors with hepatitis C viremia were traditionally deemed unsuitable for transplant. That’s all changed in the current era of highly effective direct-acting antiviral agents for the treatment of HCV infection.

In the DONATE HCV trial, Dr. Desai’s colleagues at Brigham and Women’s Hospital showed that giving HCV-uninfected recipients of hearts or lungs from HCV-viremic donors a shortened 4-week course of treatment with sofosbuvir-velpatasvir (Epclusa) beginning within a few hours after transplantation uniformly blocked viral replication. Six months after transplantation, none of the study participants had a detectable HCV viral load, and all had excellent graft function (N Engl J Med. 2019 Apr 25;380[17]:1606-17).

“This is effective prevention of HCV infection by aggressive upfront therapy,” Dr. Desai explained. “We can now take organs from HCV-viremic patients and use them in solid organ transplantation. This has led to a skyrocketing increase in donors with HCV infection, and those donations have helped us clear the waiting list.”

Donation after circulatory death

Australian transplant physicians have pioneered the use of donor hearts obtained after circulatory death in individuals with devastating neurologic injury who didn’t quite meet the criteria for brain death, which is the traditional prerequisite. In the new scenario, withdrawal of life-supporting therapy is followed by circulatory death, then the donor heart is procured and preserved via extracorporeal perfusion until transplantation.

The Australians report excellent outcomes, with rates of overall survival and rejection episodes similar to outcomes from brain-dead donors (J Am Coll Cardiol. 2019 Apr 2;73[12]:1447-59). The first U.S. heart transplant involving donation after circulatory death took place at Duke University in Durham, North Carolina. A multicenter U.S. clinical trial of this practice is underway.

If the results are positive and the practice of donation after circulatory death becomes widely implemented, the U.S. heart donor pool could increase by 30%.

Recent overhaul of donor heart allocation system may have backfired

The U.S. donor heart allocation system was redesigned in the fall of 2018 in an effort to reduce waiting times. One of the biggest changes involved breaking down the category with the highest urgency status into three new subcategories based upon sickness. Now, the highest-urgency category is for patients in cardiogenic shock who are supported by extracorporeal membrane oxygenation (ECMO) or other temporary mechanical circulatory support devices.

But an analysis of United Network for Organ Sharing (UNOS) data suggests this change has unintended adverse consequences for clinical outcomes.

Indeed, the investigators reported that the use of ECMO support is fourfold greater in the new system, the use of durable left ventricular assist devices (LVADs) as a bridge to transplant is down, and outcomes are worse. The 180-day rate of freedom from death or retransplantation was 77.9%, down significantly from 93.4% in the former system. In a multivariate analysis, patients transplanted in the new system had an adjusted 2.1-fold increased risk of death or retransplantation (J Heart Lung Transplant. 2020 Jan;39[1]:1-4).

“When you create a new listing system, you create new incentives, and people start to manage patients differently,” Dr. Desai observed. “Increasingly now, the path direct to transplant is through temporary mechanical circulatory support rather than durable mechanical circulatory support. Is that a good idea? We don’t know, but if you look at the best data, those on ECMO or percutaneous VADs have the worst outcomes. So the question of whether we should take the sickest of sick patients directly to transplant as a standard strategy has come under scrutiny.”

Improved durable LVAD technology brings impressive clinical outcomes

Results of the landmark MOMENTUM 3 randomized trial showed that 2-year clinical outcomes with the magnetically levitated centrifugal-flow HeartMate 3 LVAD now rival those of percutaneous mitral valve repair using the MitraClip device. Two-year all-cause mortality in the LVAD recipients was 22% versus 29.1% with the MitraClip in the COAPT trial and 34.9% in the MITRA-FR trial. The HeartMate 3 reduces the hemocompatibility issues that plagued earlier-generation durable LVADs, with resultant lower rates of pump thrombosis, stroke, and GI bleeding. Indeed, the outcomes in MOMENTUM 3 were so good – and so similar – with the HeartMate 3, regardless of whether the intended treatment goal was as a bridge to transplant or as lifelong destination therapy, that the investigators have recently proposed doing away with those distinctions.

“It is possible that use of arbitrary categorizations based on current or future transplant eligibility should be clinically abandoned in favor of a single preimplant strategy: to extend the survival and improve the quality of life of patients with medically refractory heart failure,” according to the investigators (JAMA Cardiol. 2020 Jan 15. doi: 10.1001/jamacardio.2019.5323).

The next step forward in LVAD technology is already on the horizon: a fully implantable device that eliminates the transcutaneous drive-line for the power supply, which is prone to infection and diminishes overall quality of life. This investigational device utilizes wireless coplanar energy transfer, with a coil ring placed around the lung and fixed to the chest wall. The implanted battery provides more than 6 hours of power without a recharge (J Heart Lung Transplant. 2019 Apr;38[4]:339-43).

“The first LVAD patient has gone swimming in Kazakhstan,” according to Dr. Desai.

Myocardial recovery in LVAD recipients remains elusive

The initial hope for LVADs was that they would not only be able to serve as a bridge to transplantation or as lifetime therapy, but that the prolonged unloading of the ventricle would enable potent medical therapy to rescue myocardial function so that the device could eventually be explanted. That does happen, but only rarely. In a large registry study, myocardial recovery occurred in only about 1% of patients on mechanical circulatory support. Attempts to enhance the process by add-on stem cell therapy have thus far been ineffective.

“For the moment, recovery is still a hope, not a reality,” the cardiologist said.

He reported serving as a consultant to more than a dozen pharmaceutical or medical device companies and receiving research grants from Alnylam, AstraZeneca, Bayer Healthcare, MyoKardia, and Novartis.

Meeting/Event
Publications
Topics
Sections
Meeting/Event
Meeting/Event

– The tragic opioid epidemic has “one small bright spot”: an expanding pool of eligible donor hearts for transplantation, Akshay S. Desai, MD, said at the annual Cardiovascular Conference at Snowmass sponsored by the American College of Cardiology.

Bruce Jancin/MDedge News
Dr. Akshay S. Desai

For decades, the annual volume of heart transplantations performed in the U.S. was static because of the huge mismatch between donor organ supply and demand. But heart transplant volume has increased steadily in the last few years – a result of the opioid epidemic.

Data from the U.S. Organ Procurement and Transplantation Network show that the proportion of donor hearts obtained from individuals who died from drug intoxication climbed from a mere 1.5% in 1999 to 17.6% in 2017, the most recent year for which data are available. Meanwhile, the size of the heart transplant waiting list, which rose year after year in 2009-2015, has since declined (N Engl J Med. 2019 Feb 7;380[6]:597-9).

“What’s amazing is that, even though these patients might have historically been considered high risk in general, the organs recovered from these patients – and particularly the hearts – don’t seem to be any worse in terms of allograft survival than the organs recovered from patients who died from other causes, which are the traditional sources, like blunt head trauma, gunshot wounds, or stroke, that lead to brain death. In general, these organs are useful and do quite well,” according to Dr. Desai, medical director of the cardiomyopathy and heart failure program at Brigham and Women’s Hospital, Boston.

He highlighted several other recent developments in the field of cardiac transplantation that promise to further expand the donor heart pool, including acceptance of hepatitis C–infected donors and organ donation after circulatory rather than brain death. Dr. Desai also drew attention to the unintended perverse consequences of a recent redesign of the U.S. donor heart allocation system and discussed the impressive improvement in clinical outcomes with mechanical circulatory support. He noted that, while relatively few cardiologists practice in the highly specialized centers where heart transplants take place, virtually all cardiologists are affected by advances in heart transplantation since hundreds of thousands of the estimated 7 million Americans with heart failure have advanced disease.

Heart transplantation, he emphasized, is becoming increasingly complex. Recipients are on average older, sicker, and have more comorbidities than in times past. As a result, there is greater need for dual organ transplants: heart/lung, heart/liver, or heart/kidney. Plus, more patients come to transplantation after prior cardiac surgery for implantation of a ventricular assist device, so sensitization to blood products is a growing issue. And, of course, the pool of transplant candidates has expanded.

“We’re now forced to take patients previously considered to have contraindications to transplant; for example, diabetes was a contraindication to transplant in the early years, but now it’s the rule in 35%-40% of our patients who present with advanced heart failure,” the cardiologist noted.
 

 

 

Transplants from HCV-infected donors to uninfected recipients

Hearts and lungs from donors with hepatitis C viremia were traditionally deemed unsuitable for transplant. That’s all changed in the current era of highly effective direct-acting antiviral agents for the treatment of HCV infection.

In the DONATE HCV trial, Dr. Desai’s colleagues at Brigham and Women’s Hospital showed that giving HCV-uninfected recipients of hearts or lungs from HCV-viremic donors a shortened 4-week course of treatment with sofosbuvir-velpatasvir (Epclusa) beginning within a few hours after transplantation uniformly blocked viral replication. Six months after transplantation, none of the study participants had a detectable HCV viral load, and all had excellent graft function (N Engl J Med. 2019 Apr 25;380[17]:1606-17).

“This is effective prevention of HCV infection by aggressive upfront therapy,” Dr. Desai explained. “We can now take organs from HCV-viremic patients and use them in solid organ transplantation. This has led to a skyrocketing increase in donors with HCV infection, and those donations have helped us clear the waiting list.”
 

Donation after circulatory death

Australian transplant physicians have pioneered the use of donor hearts obtained after circulatory death in individuals with devastating neurologic injury who didn’t quite meet the criteria for brain death, which is the traditional prerequisite. In the new scenario, withdrawal of life-supporting therapy is followed by circulatory death, then the donor heart is procured and preserved via extracorporeal perfusion until transplantation.

The Australians report excellent outcomes, with rates of overall survival and rejection episodes similar to outcomes from brain-dead donors (J Am Coll Cardiol. 2019 Apr 2;73[12]:1447-59). The first U.S. heart transplant involving donation after circulatory death took place at Duke University in Durham, North Carolina. A multicenter U.S. clinical trial of this practice is underway.

If the results are positive and the practice of donation after circulatory death becomes widely implemented, the U.S. heart donor pool could increase by 30%.
 

Recent overhaul of donor heart allocation system may have backfired

The U.S. donor heart allocation system was redesigned in the fall of 2018 in an effort to reduce waiting times. One of the biggest changes involved breaking down the category with the highest urgency status into three new subcategories based upon sickness. Now, the highest-urgency category is for patients in cardiogenic shock who are supported by extracorporeal membrane oxygenation (ECMO) or other temporary mechanical circulatory support devices.

But an analysis of United Network for Organ Sharing (UNOS) data suggests this change has unintended adverse consequences for clinical outcomes.

Indeed, the investigators reported that the use of ECMO support is fourfold greater in the new system, the use of durable left ventricular assist devices (LVADs) as a bridge to transplant is down, and outcomes are worse. The 180-day rate of freedom from death or retransplantation was 77.9%, down significantly from 93.4% in the former system. In a multivariate analysis, patients transplanted in the new system had an adjusted 2.1-fold increased risk of death or retransplantation (J Heart Lung Transplant. 2020 Jan;39[1]:1-4).

“When you create a new listing system, you create new incentives, and people start to manage patients differently,” Dr. Desai observed. “Increasingly now, the path direct to transplant is through temporary mechanical circulatory support rather than durable mechanical circulatory support. Is that a good idea? We don’t know, but if you look at the best data, those on ECMO or percutaneous VADs have the worst outcomes. So the question of whether we should take the sickest of sick patients directly to transplant as a standard strategy has come under scrutiny.”
 

Improved durable LVAD technology brings impressive clinical outcomes

Results of the landmark MOMENTUM 3 randomized trial showed that 2-year clinical outcomes with the magnetically levitated centrifugal-flow HeartMate 3 LVAD now rival those of percutaneous mitral valve repair using the MitraClip device. Two-year all-cause mortality in the LVAD recipients was 22% versus 29.1% with the MitraClip in the COAPT trial and 34.9% in the MITRA-FR trial. The HeartMate 3 reduces the hemocompatibility issues that plagued earlier-generation durable LVADs, with resultant lower rates of pump thrombosis, stroke, and GI bleeding. Indeed, the outcomes in MOMENTUM 3 were so good – and so similar – with the HeartMate 3, regardless of whether the intended treatment goal was as a bridge to transplant or as lifelong destination therapy, that the investigators have recently proposed doing away with those distinctions.

“It is possible that use of arbitrary categorizations based on current or future transplant eligibility should be clinically abandoned in favor of a single preimplant strategy: to extend the survival and improve the quality of life of patients with medically refractory heart failure,” according to the investigators (JAMA Cardiol. 2020 Jan 15. doi: 10.1001/jamacardio.2019.5323).

The next step forward in LVAD technology is already on the horizon: a fully implantable device that eliminates the transcutaneous drive-line for the power supply, which is prone to infection and diminishes overall quality of life. This investigational device utilizes wireless coplanar energy transfer, with a coil ring placed around the lung and fixed to the chest wall. The implanted battery provides more than 6 hours of power without a recharge (J Heart Lung Transplant. 2019 Apr;38[4]:339-43).

“The first LVAD patient has gone swimming in Kazakhstan,” according to Dr. Desai.

Myocardial recovery in LVAD recipients remains elusive

The initial hope for LVADs was that they would not only be able to serve as a bridge to transplantation or as lifetime therapy, but that the prolonged unloading of the ventricle would enable potent medical therapy to rescue myocardial function so that the device could eventually be explanted. That does happen, but only rarely. In a large registry study, myocardial recovery occurred in only about 1% of patients on mechanical circulatory support. Attempts to enhance the process by add-on stem cell therapy have thus far been ineffective.

“For the moment, recovery is still a hope, not a reality,” the cardiologist said.

He reported serving as a consultant to more than a dozen pharmaceutical or medical device companies and receiving research grants from Alnylam, AstraZeneca, Bayer Healthcare, MyoKardia, and Novartis.

– The tragic opioid epidemic has “one small bright spot”: an expanding pool of eligible donor hearts for transplantation, Akshay S. Desai, MD, said at the annual Cardiovascular Conference at Snowmass sponsored by the American College of Cardiology.

Bruce Jancin/MDedge News
Dr. Akshay S. Desai

For decades, the annual volume of heart transplantations performed in the U.S. was static because of the huge mismatch between donor organ supply and demand. But heart transplant volume has increased steadily in the last few years – a result of the opioid epidemic.

Data from the U.S. Organ Procurement and Transplantation Network show that the proportion of donor hearts obtained from individuals who died from drug intoxication climbed from a mere 1.5% in 1999 to 17.6% in 2017, the most recent year for which data are available. Meanwhile, the size of the heart transplant waiting list, which rose year after year in 2009-2015, has since declined (N Engl J Med. 2019 Feb 7;380[6]:597-9).

“What’s amazing is that, even though these patients might have historically been considered high risk in general, the organs recovered from these patients – and particularly the hearts – don’t seem to be any worse in terms of allograft survival than the organs recovered from patients who died from other causes, which are the traditional sources, like blunt head trauma, gunshot wounds, or stroke, that lead to brain death. In general, these organs are useful and do quite well,” according to Dr. Desai, medical director of the cardiomyopathy and heart failure program at Brigham and Women’s Hospital, Boston.

He highlighted several other recent developments in the field of cardiac transplantation that promise to further expand the donor heart pool, including acceptance of hepatitis C–infected donors and organ donation after circulatory rather than brain death. Dr. Desai also drew attention to the unintended perverse consequences of a recent redesign of the U.S. donor heart allocation system and discussed the impressive improvement in clinical outcomes with mechanical circulatory support. He noted that, while relatively few cardiologists practice in the highly specialized centers where heart transplants take place, virtually all cardiologists are affected by advances in heart transplantation since hundreds of thousands of the estimated 7 million Americans with heart failure have advanced disease.

Heart transplantation, he emphasized, is becoming increasingly complex. Recipients are on average older, sicker, and have more comorbidities than in times past. As a result, there is greater need for dual organ transplants: heart/lung, heart/liver, or heart/kidney. Plus, more patients come to transplantation after prior cardiac surgery for implantation of a ventricular assist device, so sensitization to blood products is a growing issue. And, of course, the pool of transplant candidates has expanded.

“We’re now forced to take patients previously considered to have contraindications to transplant; for example, diabetes was a contraindication to transplant in the early years, but now it’s the rule in 35%-40% of our patients who present with advanced heart failure,” the cardiologist noted.
 

 

 

Transplants from HCV-infected donors to uninfected recipients

Hearts and lungs from donors with hepatitis C viremia were traditionally deemed unsuitable for transplant. That’s all changed in the current era of highly effective direct-acting antiviral agents for the treatment of HCV infection.

In the DONATE HCV trial, Dr. Desai’s colleagues at Brigham and Women’s Hospital showed that giving HCV-uninfected recipients of hearts or lungs from HCV-viremic donors a shortened 4-week course of treatment with sofosbuvir-velpatasvir (Epclusa) beginning within a few hours after transplantation uniformly blocked viral replication. Six months after transplantation, none of the study participants had a detectable HCV viral load, and all had excellent graft function (N Engl J Med. 2019 Apr 25;380[17]:1606-17).

“This is effective prevention of HCV infection by aggressive upfront therapy,” Dr. Desai explained. “We can now take organs from HCV-viremic patients and use them in solid organ transplantation. This has led to a skyrocketing increase in donors with HCV infection, and those donations have helped us clear the waiting list.”

Donation after circulatory death

Australian transplant physicians have pioneered the use of donor hearts obtained after circulatory death in individuals with devastating neurologic injury who didn’t quite meet the criteria for brain death, which is the traditional prerequisite. In the new scenario, withdrawal of life-supporting therapy is followed by circulatory death, then the donor heart is procured and preserved via extracorporeal perfusion until transplantation.

The Australians report excellent outcomes, with rates of overall survival and rejection episodes similar to outcomes from brain-dead donors (J Am Coll Cardiol. 2019 Apr 2;73[12]:1447-59). The first U.S. heart transplant involving donation after circulatory death took place at Duke University in Durham, North Carolina. A multicenter U.S. clinical trial of this practice is underway.

If the results are positive and the practice of donation after circulatory death becomes widely implemented, the U.S. heart donor pool could increase by 30%.

Recent overhaul of donor heart allocation system may have backfired

The U.S. donor heart allocation system was redesigned in the fall of 2018 in an effort to reduce waiting times. One of the biggest changes involved breaking down the highest-urgency status category into three new subcategories based on severity of illness. Now, the highest-urgency category is reserved for patients in cardiogenic shock who are supported by extracorporeal membrane oxygenation (ECMO) or other temporary mechanical circulatory support devices.

But an analysis of United Network for Organ Sharing (UNOS) data suggests this change has had unintended adverse consequences for clinical outcomes.

Indeed, the investigators reported that the use of ECMO support is fourfold greater in the new system, the use of durable left ventricular assist devices (LVADs) as a bridge to transplant is down, and outcomes are worse. The 180-day rate of freedom from death or retransplantation was 77.9%, down significantly from 93.4% in the former system. In a multivariate analysis, patients transplanted in the new system had an adjusted 2.1-fold increased risk of death or retransplantation (J Heart Lung Transplant. 2020 Jan;39[1]:1-4).

“When you create a new listing system, you create new incentives, and people start to manage patients differently,” Dr. Desai observed. “Increasingly now, the path direct to transplant is through temporary mechanical circulatory support rather than durable mechanical circulatory support. Is that a good idea? We don’t know, but if you look at the best data, those on ECMO or percutaneous VADs have the worst outcomes. So the question of whether we should take the sickest of sick patients directly to transplant as a standard strategy has come under scrutiny.”

Improved durable LVAD technology brings impressive clinical outcomes

Results of the landmark MOMENTUM 3 randomized trial showed that 2-year clinical outcomes with the magnetically levitated centrifugal-flow HeartMate 3 LVAD now rival those of percutaneous mitral valve repair using the MitraClip device. Two-year all-cause mortality in the LVAD recipients was 22% versus 29.1% with the MitraClip in the COAPT trial and 34.9% in the MITRA-FR trial. The HeartMate 3 reduces the hemocompatibility issues that plagued earlier-generation durable LVADs, with resultant lower rates of pump thrombosis, stroke, and GI bleeding. Indeed, the outcomes in MOMENTUM 3 were so good – and so similar – with the HeartMate 3, regardless of whether the intended treatment goal was as a bridge to transplant or as lifelong destination therapy, that the investigators have recently proposed doing away with those distinctions.

“It is possible that use of arbitrary categorizations based on current or future transplant eligibility should be clinically abandoned in favor of a single preimplant strategy: to extend the survival and improve the quality of life of patients with medically refractory heart failure,” according to the investigators (JAMA Cardiol. 2020 Jan 15. doi: 10.1001/jamacardio.2019.5323).

The next step forward in LVAD technology is already on the horizon: a fully implantable device that eliminates the transcutaneous drive-line for the power supply, which is prone to infection and diminishes overall quality of life. This investigational device utilizes wireless coplanar energy transfer, with a coil ring placed around the lung and fixed to the chest wall. The implanted battery provides more than 6 hours of power without a recharge (J Heart Lung Transplant. 2019 Apr;38[4]:339-43).

“The first LVAD patient has gone swimming in Kazakhstan,” according to Dr. Desai.

Myocardial recovery in LVAD recipients remains elusive

The initial hope for LVADs was that they would serve not only as a bridge to transplantation or as lifetime therapy but also as a bridge to recovery: prolonged unloading of the ventricle would enable potent medical therapy to rescue myocardial function so that the device could eventually be explanted. That does happen, but only rarely. In a large registry study, myocardial recovery occurred in only about 1% of patients on mechanical circulatory support. Attempts to enhance the process with add-on stem cell therapy have thus far been ineffective.

“For the moment, recovery is still a hope, not a reality,” the cardiologist said.

He reported serving as a consultant to more than a dozen pharmaceutical or medical device companies and receiving research grants from Alnylam, AstraZeneca, Bayer Healthcare, MyoKardia, and Novartis.

EXPERT ANALYSIS FROM ACC SNOWMASS 2020

Nonuremic Calciphylaxis Triggered by Rapid Weight Loss and Hypotension

Calciphylaxis, otherwise known as calcific uremic arteriolopathy, is characterized by calcification of the tunica media of the small- to medium-sized blood vessels of the dermis and subcutis, leading to ischemia and necrosis.1 It is a deadly disease with a 1-year mortality rate of more than 50%.2 End-stage renal disease (ESRD) is the most common risk factor for calciphylaxis, which affects an estimated 1% to 4% of hemodialysis patients in the United States.2-5 However, nonuremic calciphylaxis (NUC) has been increasingly reported in the literature and has risk factors other than ESRD, including but not limited to obesity, alcoholic liver disease, primary hyperparathyroidism, connective tissue disease, and underlying malignancy.3,6-9 Triggers for calciphylaxis in at-risk patients include use of corticosteroids or warfarin, iron or albumin infusions, and rapid weight loss.3,6,9-11 We report an unusual case of NUC that most likely was triggered by rapid weight loss and hypotension in a patient with multiple risk factors for calciphylaxis.

Case Report

A 75-year-old white woman with a history of morbid obesity (body mass index, 40 kg/m2), unexplained weight loss of 70 lb over the past year, and polymyalgia rheumatica requiring chronic prednisone therapy presented with painful lesions on the thighs, buttocks, and right shoulder of 4 months’ duration. Preceding the onset of the lesions, she had multiple hospital admissions for severe infections resulting in sepsis with hypotension, including Enterococcus faecalis endocarditis, extended-spectrum beta-lactamase bacteremia, and Pseudomonas aeruginosa pneumonia. Physical examination revealed large, well-demarcated ulcers and necrotic eschars with surrounding violaceous induration and stellate erythema on the anterior, medial, and posterior thighs and buttocks that were exquisitely tender (Figures 1 and 2).

Figure 1. Necrotic eschars surrounded by erythema and livedo reticularis on the right medial thigh.

Figure 2. Eschar with a rolled erythematous border on the left lateral thigh.

Notable laboratory results included hypoalbuminemia (1.3 g/dL [reference range, 3.5–5.0 g/dL]) with normal renal function, a corrected calcium level of 9.7 mg/dL (reference range, 8.2–10.2 mg/dL), a serum phosphorus level of 3.5 mg/dL (reference range, 2.3–4.7 mg/dL), a calcium-phosphate product of 27.3 mg2/dL2 (reference range, <55 mg2/dL2), and a parathyroid hormone level of 49.3 pg/mL (reference range, 10–65 pg/mL). Antinuclear antibodies were negative. A hypercoagulability evaluation showed normal protein C and S levels, negative lupus anticoagulant, and negative anticardiolipin antibodies.
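
Two standard bedside calculations underlie the values above: albumin-corrected calcium (a common approximation adds 0.8 mg/dL of calcium for every 1 g/dL of albumin below 4.0 g/dL) and the calcium-phosphate product (calcium × phosphorus). As a minimal illustrative sketch only — the report does not state the measured (uncorrected) calcium or the exact correction formula used, so the inputs below are hypothetical:

```python
def corrected_calcium(measured_ca_mg_dl: float, albumin_g_dl: float) -> float:
    """Albumin-corrected calcium (common approximation):
    add 0.8 mg/dL of calcium per 1 g/dL of albumin below 4.0 g/dL."""
    return measured_ca_mg_dl + 0.8 * (4.0 - albumin_g_dl)


def ca_phos_product(calcium_mg_dl: float, phosphorus_mg_dl: float) -> float:
    """Calcium-phosphate product in mg^2/dL^2; values above the
    reference threshold of 55 favor metastatic calcification."""
    return calcium_mg_dl * phosphorus_mg_dl


# Hypothetical measured calcium of 7.5 mg/dL (not stated in the report)
# with the patient's albumin of 1.3 g/dL corrects to roughly 9.7 mg/dL.
print(round(corrected_calcium(7.5, 1.3), 2))  # 9.66
print(round(ca_phos_product(7.5, 3.5), 2))    # 26.25
```

Under these hypothetical inputs, the product computed from the measured calcium (about 26 mg2/dL2) lands close to the reported 27.3 mg2/dL2, suggesting the authors may have used the measured rather than the corrected calcium for the product.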

Telescoping punch biopsies of the indurated borders of the eschars showed prominent calcification of the small- and medium-sized vessels in the mid and deep dermis, intravascular thrombi, and necrosis of the epidermis and subcutaneous fat consistent with calciphylaxis (Figure 3).

Figure 3. A, Epidermal necrosis, small- and medium-sized vessel calcification and thrombus, and underlying septal panniculitis with fat necrosis (H&E, original magnification ×100). B, High-power magnification of small vessel calcification in the subcutaneous fat (H&E, original magnification ×400).


After the diagnosis of calciphylaxis was made, the patient was treated with intravenous sodium thiosulfate 25 g 3 times weekly and alendronate 70 mg weekly. Daily arterial blood gas studies did not detect metabolic acidosis during the patient’s sodium thiosulfate therapy. The wounds were debrided, and we attempted to slowly taper the patient off the oral prednisone. Unfortunately, her condition slowly deteriorated secondary to sepsis, resulting in septic shock. The patient died 3 weeks after the diagnosis of calciphylaxis was made. At the time of diagnosis, the patient had a poor prognosis and notable risk for sepsis due to the large eschars on the thighs and abdomen as well as her relative immunosuppression due to chronic prednisone use.

Comment

Background on Calciphylaxis
Calciphylaxis is a rare but deadly disease that affects both ESRD patients receiving dialysis and patients without ESRD who have known risk factors for calciphylaxis, including female gender, white race, obesity, alcoholic liver disease, primary hyperparathyroidism, connective tissue disease, underlying malignancy, protein C or S deficiency, corticosteroid use, warfarin use, diabetes, iron or albumin infusions, and rapid weight loss.3,6-9,11 Although the molecular pathogenesis of calciphylaxis is not completely understood, it is believed to be caused by local deposition of calcium in the tunica media of small- to medium-sized arterioles and venules in the skin.12 This deposition leads to intimal proliferation and progressive narrowing of the vessels with resultant thrombosis, ischemia, and necrosis. The cutaneous manifestations and histopathology of calciphylaxis classically follow its pathogenesis. Calciphylaxis typically presents with livedo reticularis as vessels narrow and then progresses to purpura, bullae, necrosis, and eschar formation with the onset of acute thrombosis and ischemia. Histopathology is characterized by small- and medium-sized vessel calcification and thrombus, dermal necrosis, and septal panniculitis, though the histology can be highly variable.12 Unfortunately, the already poor prognosis for calciphylaxis worsens when lesions become either ulcerative or present on the proximal extremities and trunk.4,13 Sepsis is the leading cause of death in calciphylaxis patients, affecting more than 50% of patients.2,3,14 The differential diagnoses for calciphylactic-appearing lesions include warfarin-induced skin necrosis, disseminated intravascular coagulation, pyoderma gangrenosum, cholesterol emboli, and various vasculitides and coagulopathies.

Risk Factors
Our case demonstrates the importance of risk factor minimization, trigger avoidance, and early intervention due to the high mortality rate of calciphylaxis. Selye et al15 coined the term calciphylaxis in 1961 based on experiments that induced calciphylaxis in rat models. Their research concluded that there were certain sensitizers (ie, risk factors) that predisposed patients to medial calcium deposition in blood vessels and other challengers (ie, triggers) that acted as inciting events to calcium deposition. Our patient presented with multiple known risk factors for calciphylaxis, including obesity (body mass index, 40 kg/m2), female gender, white race, hypoalbuminemia, and chronic corticosteroid use.16 In the presence of a milieu of risk factors, the patient’s rapid weight loss and episodes of hypotension likely were triggers for calciphylaxis.

Other case reports in the literature have suggested weight loss as a trigger for NUC. One morbidly obese patient with inactive rheumatoid arthritis had onset of calciphylaxis lesions after unintentional weight loss of approximately 50% of body weight in 1 year17; however, the weight loss does not have to be drastic to trigger calciphylaxis. Another study of 16 patients with uremic calciphylaxis found that 7 of 16 (44%) patients lost 10 to 50 kg in the 6 months prior to calciphylaxis onset.14 One mechanism proposed by Munavalli et al10 is that elevated levels of matrix metalloproteinases during catabolic weight loss states enhance the deposition of calcium into elastic fibers of small vessels. The authors found elevated serum levels of matrix metalloproteinases in their patients with NUC induced by rapid weight loss.10

A meta-analysis by Nigwekar et al3 found a history of prior corticosteroid use in 61% (22/36) of NUC cases reviewed. However, it is unclear whether it is the use of corticosteroids or chronic inflammation that is implicated in NUC pathogenesis. Chronic inflammation causes downregulation of anticalcification signaling pathways.18-20 The role of 2 vascular calcification inhibitors has been evaluated in the pathogenesis of calciphylaxis: fetuin-A and matrix Gla protein (MGP).21 The activity of these proteins is decreased not only in calciphylaxis but also in other inflammatory states and chronic renal failure.18-20 One study found lower fetuin-A levels in 312 hemodialysis patients compared to healthy controls and an association between low fetuin-A levels and increased C-reactive protein levels.22 Reduced fetuin-A and MGP levels may be the result of several calciphylaxis risk factors. Warfarin is believed to trigger calciphylaxis via inhibition of gamma-carboxylation of MGP, which is necessary for its anticalcification activity.23 Hypoalbuminemia and alcoholic liver disease also are risk factors that may be explained by the fact that fetuin-A is synthesized in the liver.24 Therefore, liver disease results in decreased production of fetuin-A that is permissive to vascular calcification in calciphylaxis patients.

There have been other reports of calciphylaxis patients who were originally hospitalized due to hypotension, which may serve as a trigger for calciphylaxis onset.25 Because calciphylaxis lesions are more likely to occur in the fatty areas of the abdomen and proximal thighs where blood flow is slower, hypotension likely accentuates the slowing of blood flow and subsequent blood vessel calcification. This theory is supported by studies showing that established calciphylactic lesions worsen more quickly in the presence of systemic hypotension.26 One patient with ESRD and calciphylaxis of the breasts had consistent systolic blood pressure readings in the high 60s to low 70s between dialysis sessions.27 Due to this association, we recommend that patients with calciphylaxis have close blood pressure monitoring to aid in preventing disease progression.28

Management
Calciphylaxis treatment has not yet been standardized, as it is an uncommon disease whose pathogenesis is not fully understood. Current management strategies aim to normalize metabolic abnormalities such as hypercalcemia if they are present and remove inciting agents such as warfarin and corticosteroids.29 Other medical treatments that have been successfully used include sodium thiosulfate, oral steroids, and adjunctive bisphosphonates.29-31 Sodium thiosulfate is known to cause metabolic acidosis by generating thiosulfuric acid in vivo in patients with or without renal disease; therefore, patients on sodium thiosulfate therapy should be monitored for development of metabolic acidosis and treated with oral sodium bicarbonate or dialysis as needed.30,32 Wound care also is an important element of calciphylaxis treatment; however, the debridement of wounds is controversial. Some argue that dry intact eschars serve to protect against sepsis, which is the leading cause of death in calciphylaxis.2,14,33 In contrast, a retrospective study of 63 calciphylaxis patients found a 1-year survival rate of 61.6% in 17 patients receiving wound debridement vs 27.4% in 46 patients who did not.2 The current consensus is that debridement should be considered on a case-by-case basis, factoring in the presence of wound infection, size of wounds, stability of eschars, and treatment goals of the patient.34 Future studies should be aimed at this issue, with special focus on how these factors and the decision to debride or not impact patient outcomes.

Conclusion

Calciphylaxis is a potentially fatal disease that affects both patients with ESRD and those with nonuremic risk factors. The term calcific uremic arteriolopathy should be abandoned, as nonuremic causes are being reported with increasing frequency in the literature. In such cases, patients often have multiple risk factors, including obesity, primary hyperparathyroidism, alcoholic liver disease, and underlying malignancy, among others. Certain triggers for the onset of calciphylaxis should be avoided in at-risk patients, including the use of corticosteroids or warfarin; iron and albumin infusions; hypotension; and rapid weight loss. Our fatal case of NUC is a reminder to dermatologists treating at-risk patients to avoid these triggers and to keep calciphylaxis in the differential diagnosis when encountering early lesions such as livedo reticularis, as progression of these lesions carries a 1-year mortality rate of more than 50% with currently available therapies.

References
  1. Au S, Crawford RI. Three-dimensional analysis of a calciphylaxis plaque: clues to pathogenesis. J Am Acad Dermatol. 2002;47:53-57.
  2. Weenig RH, Sewell LD, Davis MD, et al. Calciphylaxis: natural history, risk factor analysis, and outcome. J Am Acad Dermatol. 2007;56:569-579.
  3. Nigwekar SU, Wolf M, Sterns RH, et al. Calciphylaxis from nonuremic causes: a systematic review. Clin J Am Soc Nephrol. 2008;3:1139-1143.
  4. Fine A, Zacharias J. Calciphylaxis is usually non-ulcerating: risk factors, outcome and therapy. Kidney Int. 2002;61:2210-2217.
  5. Angelis M, Wong LL, Myers SA, et al. Calciphylaxis in patients on hemodialysis: a prevalence study. Surgery. 1997;122:1083-1090.
  6. Chavel SM, Taraszka KS, Schaffer JV, et al. Calciphylaxis associated with acute, reversible renal failure in the setting of alcoholic cirrhosis. J Am Acad Dermatol. 2004;50:125-128.
  7. Bosler DS, Amin MB, Gulli F, et al. Unusual case of calciphylaxis associated with metastatic breast carcinoma. Am J Dermatopathol. 2007;29:400-403.
  8. Buxtorf K, Cerottini JP, Panizzon RG. Lower limb skin ulcerations, intravascular calcifications and sensorimotor polyneuropathy: calciphylaxis as part of a hyperparathyroidism? Dermatology. 1999;198:423-425.
  9. Brouns K, Verbeken E, Degreef H, et al. Fatal calciphylaxis in two patients with giant cell arteritis. Clin Rheumatol. 2007;26:836-840.
  10. Munavalli G, Reisenauer A, Moses M, et al. Weight loss-induced calciphylaxis: potential role of matrix metalloproteinases. J Dermatol. 2003;30:915-919.
  11. Bae GH, Nambudiri VE, Bach DQ, et al. Rapidly progressive nonuremic calciphylaxis in setting of warfarin. Am J Med. 2015;128:E19-E21.
  12. Essary LR, Wick MR. Cutaneous calciphylaxis: an underrecognized clinicopathologic entity. Am J Clin Pathol. 2000;113:280-287.
  13. Hafner J, Keusch G, Wahl C, et al. Uremic small-artery disease with medial calcification and intimal hyperplasia (so-called calciphylaxis): a complication of chronic renal failure and benefit from parathyroidectomy. J Am Acad Dermatol. 1995;33:954-962.
  14. Coates T, Kirkland GS, Dymock RB, et al. Cutaneous necrosis from calcific uremic arteriolopathy. Am J Kidney Dis. 1998;32:384-391.
  15. Selye H, Gentile G, Prioreschi P. Cutaneous molt induced by calciphylaxis in the rat. Science. 1961;134:1876-1877.
  16. Kalajian AH, Malhotra PS, Callen JP, et al. Calciphylaxis with normal renal and parathyroid function: not as rare as previously believed. Arch Dermatol. 2009;145:451-458.
  17. Malabu U, Roberts L, Sangla K. Calciphylaxis in a morbidly obese woman with rheumatoid arthritis presenting with severe weight loss and vitamin D deficiency. Endocr Pract. 2011;17:104-108.
  18. Schäfer C, Heiss A, Schwarz A, et al. The serum protein alpha 2–Heremans-Schmid glycoprotein/fetuin-A is a systemically acting inhibitor of ectopic calcification. J Clin Invest. 2003;112:357-366.
  19. Cozzolino M, Galassi A, Biondi ML, et al. Serum fetuin-A levels link inflammation and cardiovascular calcification in hemodialysis patients. Am J Nephrol. 2006;26:423-429.
  20. Luo G, Ducy P, McKee MD, et al. Spontaneous calcification of arteries and cartilage in mice lacking matrix GLA protein. Nature. 1997;386:78-81.
  21. Weenig RH. Pathogenesis of calciphylaxis: Hans Selye to nuclear factor kappa-B. J Am Acad Dermatol. 2008;58:458-471.
  22. Ketteler M, Bongartz P, Westenfeld R, et al. Association of low fetuin-A (AHSG) concentrations in serum with cardiovascular mortality in patients on dialysis: a cross-sectional study. Lancet. 2003;361:827-833.
  23. Wallin R, Cain D, Sane DC. Matrix Gla protein synthesis and gamma-carboxylation in the aortic vessel wall and proliferating vascular smooth muscle cells a cell system which resembles the system in bone cells. Thromb Haemost. 1999;82:1764-1767.
  24. Sowers KM, Hayden MR. Calcific uremic arteriolopathy: pathophysiology, reactive oxygen species and therapeutic approaches. Oxid Med Cell Longev. 2010;3:109-121.
  25. Allegretti AS, Nazarian RM, Goverman J, et al. Calciphylaxis: a rare but fatal delayed complication of Roux-en-Y gastric bypass surgery. Am J Kidney Dis. 2014;64:274-277.
  26. Wilmer WA, Magro CM. Calciphylaxis: emerging concepts in prevention, diagnosis, and treatment. Semin Dial. 2002;15:172-186.
  27. Gupta D, Tadros R, Mazumdar A, et al. Breast lesions with intractable pain in end-stage renal disease: calciphylaxis with chronic hypotensive dermatopathy related watershed breast lesions. J Palliat Med. 2013;16:551-554.
  28. Janigan DT, Hirsch DJ, Klassen GA, et al. Calcified subcutaneous arterioles with infarcts of the subcutis and skin (“calciphylaxis”) in chronic renal failure. Am J Kidney Dis. 2000;35:588-597.
  29. Jeong HS, Dominguez AR. Calciphylaxis: controversies in pathogenesis, diagnosis and treatment. Am J Med Sci. 2016;351:217-227.
  30. Bourgeois P, De Haes P. Sodium thiosulfate as a treatment for calciphylaxis: a case series. J Dermatolog Treat. 2016;27:520-524.
  31. Biswas A, Walsh NM, Tremaine R. A case of nonuremic calciphylaxis treated effectively with systemic corticosteroids. J Cutan Med Surg. 2016;20:275-278.
  32. Selk N, Rodby RA. Unexpectedly severe metabolic acidosis associated with sodium thiosulfate therapy in a patient with calcific uremic arteriolopathy. Semin Dial. 2011;24:85-88.
  33. Martin R. Mysterious calciphylaxis: wounds with eschar—to debride or not to debride? Ostomy Wound Manage. 2004;50:64-66, 68-70.
  34. Nigwekar SU, Kroshinsky D, Nazarian RM, et al. Calciphylaxis: risk factors, diagnosis, and treatment. Am J Kidney Dis. 2015;66:133-146.
Author and Disclosure Information

Dr. Kolb is from the Department of Dermatology, Orange Park Medical Center, Florida. Drs. Ellis and LaFond are from the Department of Dermatology, St. Joseph Mercy Hospital, Ann Arbor, Michigan.

The authors report no conflict of interest.

Correspondence: Logan J. Kolb, DO, Orange Park Medical Center, 2001 Kingsley Ave, Orange Park, FL 32073 ([email protected]).

Issue: Cutis - 105(1), E11-E14

There have been other reports of calciphylaxis patients who were originally hospitalized due to hypotension, which may serve as a trigger for calciphylaxis onset.25 Because calciphylaxis lesions are more likely to occur in the fatty areas of the abdomen and proximal thighs where blood flow is slower, hypotension likely accentuates the slowing of blood flow and subsequent blood vessel calcification. This theory is supported by studies showing that established calciphylactic lesions worsen more quickly in the presence of systemic hypotension.26 One patient with ESRD and calciphylaxis of the breasts had consistent systolic blood pressure readings in the high 60s to low 70s between dialysis sessions.27 Due to this association, we recommend that patients with calciphylaxis have close blood pressure monitoring to aid in preventing disease progression.28

Management
Calciphylaxis treatment has not yet been standardized, as it is an uncommon disease whose pathogenesis is not fully understood. Current management strategies aim to normalize metabolic abnormalities such as hypercalcemia if they are present and remove inciting agents such as warfarin and corticosteroids.29 Other medical treatments that have been successfully used include sodium thiosulfate, oral steroids, and adjunctive bisphosphonates.29-31 Sodium thiosulfate is known to cause metabolic acidosis by generating thiosulfuric acid in vivo in patients with or without renal disease; therefore, patients on sodium thiosulfate therapy should be monitored for development of metabolic acidosis and treated with oral sodium bicarbonate or dialysis as needed.30,32 Wound care also is an important element of calciphylaxis treatment; however, the debridement of wounds is controversial. Some argue that dry intact eschars serve to protect against sepsis, which is the leading cause of death in calciphylaxis.2,14,33 In contrast, a retrospective study of 63 calciphylaxis patients found a 1-year survival rate of 61.6% in 17 patients receiving wound debridement vs 27.4% in 46 patients who did not.2 The current consensus is that debridement should be considered on a case-by-case basis, factoring in the presence of wound infection, size of wounds, stability of eschars, and treatment goals of the patient.34 Future studies should be aimed at this issue, with special focus on how these factors and the decision to debride or not impact patient outcomes.

Conclusion

Calciphylaxis is a potentially fatal disease that impacts both patients with ESRD and those with nonuremic risk factors. The term calcific uremic arteriolopathy should be disregarded, as nonuremic causes are being reported with increased frequency in the literature. In such cases, patients often have multiple risk factors, including obesity, primary hyperparathyroidism, alcoholic liver disease, and underlying malignancy, among others. Certain triggers for onset of calciphylaxis should be avoided in at-risk patients, including the use of corticosteroids or warfarin; iron and albumin infusions; hypotension; and rapid weight loss. Our fatal case of NUC is a reminder to dermatologists treating at-risk patients to avoid these triggers and to keep calciphylaxis in the differential diagnosis when encountering early lesions such as livedo reticularis, as progression of these lesions has a 1-year mortality rate of more than 50% with the therapies being utilized at this time.

Calciphylaxis, otherwise known as calcific uremic arteriolopathy, is characterized by calcification of the tunica media of the small- to medium-sized blood vessels of the dermis and subcutis, leading to ischemia and necrosis.1 It is a deadly disease with a 1-year mortality rate of more than 50%.2 End-stage renal disease (ESRD) is the most common risk factor, with calciphylaxis affecting 1% to 4% of hemodialysis patients in the United States.2-5 However, nonuremic calciphylaxis (NUC) has been increasingly reported in the literature and has risk factors other than ESRD, including but not limited to obesity, alcoholic liver disease, primary hyperparathyroidism, connective tissue disease, and underlying malignancy.3,6-9 Triggers for calciphylaxis in at-risk patients include use of corticosteroids or warfarin, iron or albumin infusions, and rapid weight loss.3,6,9-11 We report an unusual case of NUC that most likely was triggered by rapid weight loss and hypotension in a patient with multiple risk factors for calciphylaxis.

Case Report

A 75-year-old white woman with history of morbid obesity (body mass index, 40 kg/m2), unexplained weight loss of 70 lb over the last year, and polymyalgia rheumatica requiring chronic prednisone therapy presented with painful lesions on the thighs, buttocks, and right shoulder of 4 months’ duration. She had multiple hospital admissions preceding the onset of lesions for severe infections resulting in sepsis with hypotension, including Enterococcus faecalis endocarditis, extended-spectrum beta-lactamase bacteremia, and Pseudomonas aeruginosa pneumonia. Physical examination revealed large well-demarcated ulcers and necrotic eschars with surrounding violaceous induration and stellate erythema on the anterior, medial, and posterior thighs and buttocks that were exquisitely tender (Figures 1 and 2).

Figure 1. Necrotic eschars surrounded by erythema and livedo reticularis on the right medial thigh.

Figure 2. Eschar with a rolled erythematous border on the left lateral thigh.

Notable laboratory results included hypoalbuminemia (1.3 g/dL [reference range, 3.5–5.0 g/dL]) with normal renal function, a corrected calcium level of 9.7 mg/dL (reference range, 8.2–10.2 mg/dL), a serum phosphorus level of 3.5 mg/dL (reference range, 2.3–4.7 mg/dL), a calcium-phosphate product of 27.3 mg2/dL2 (reference range, <55 mg2/dL2), and a parathyroid hormone level of 49.3 pg/mL (reference range, 10–65 pg/mL). Antinuclear antibodies were negative. A hypercoagulability evaluation showed normal protein C and S levels, negative lupus anticoagulant, and negative anticardiolipin antibodies.
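The two derived values above follow standard formulas, sketched below for reference. This is an illustrative sketch only; the helper names are ours, and the conventional albumin correction factor of 0.8 mg/dL of calcium per 1 g/dL of albumin below 4.0 g/dL is assumed.

```python
# Hypothetical helpers illustrating the two derived laboratory values above.
# Assumption: the conventional 0.8 mg/dL-per-g/dL albumin correction factor.

def corrected_calcium(measured_ca_mg_dl: float, albumin_g_dl: float) -> float:
    """Albumin-corrected serum calcium in mg/dL."""
    return measured_ca_mg_dl + 0.8 * (4.0 - albumin_g_dl)

def ca_phos_product(ca_mg_dl: float, phos_mg_dl: float) -> float:
    """Calcium-phosphate product in mg^2/dL^2 (reference range, <55)."""
    return ca_mg_dl * phos_mg_dl

# With this patient's albumin of 1.3 g/dL, a measured calcium of 7.5 mg/dL
# corrects to approximately the reported 9.7 mg/dL.
print(round(corrected_calcium(7.5, 1.3), 1))  # 9.7
```

Note that the reported product of 27.3 mg2/dL2 divided by the phosphorus of 3.5 mg/dL implies a measured (uncorrected) calcium of 7.8 mg/dL, suggesting the product was computed from the uncorrected value; either way it falls far below the 55 mg2/dL2 threshold associated with metastatic calcification.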

Telescoping punch biopsies of the indurated borders of the eschars showed prominent calcification of the small- and medium-sized vessels in the mid and deep dermis, intravascular thrombi, and necrosis of the epidermis and subcutaneous fat consistent with calciphylaxis (Figure 3).

Figure 3. A, Epidermal necrosis, small- and medium-sized vessel calcification and thrombus, and underlying septal panniculitis with fat necrosis (H&E, original magnification ×100). B, High-power magnification of small vessel calcification in the subcutaneous fat (H&E, original magnification ×400).


After the diagnosis of calciphylaxis was made, the patient was treated with intravenous sodium thiosulfate 25 g 3 times weekly and alendronate 70 mg weekly. Daily arterial blood gas studies did not detect metabolic acidosis during the patient’s sodium thiosulfate therapy. The wounds were debrided, and we attempted to slowly taper the patient off the oral prednisone. Unfortunately, her condition slowly deteriorated secondary to sepsis, resulting in septic shock. The patient died 3 weeks after the diagnosis of calciphylaxis was made. At the time of diagnosis, the patient had a poor prognosis and notable risk for sepsis due to the large eschars on the thighs and abdomen as well as her relative immunosuppression due to chronic prednisone use.
 

 

Comment

Background on Calciphylaxis
Calciphylaxis is a rare but deadly disease that affects both ESRD patients receiving dialysis and patients without ESRD who have known risk factors for calciphylaxis, including female gender, white race, obesity, alcoholic liver disease, primary hyperparathyroidism, connective tissue disease, underlying malignancy, protein C or S deficiency, corticosteroid use, warfarin use, diabetes, iron or albumin infusions, and rapid weight loss.3,6-9,11 Although the molecular pathogenesis of calciphylaxis is not completely understood, it is believed to be caused by local deposition of calcium in the tunica media of small- to medium-sized arterioles and venules in the skin.12 This deposition leads to intimal proliferation and progressive narrowing of the vessels with resultant thrombosis, ischemia, and necrosis. The cutaneous manifestations and histopathology of calciphylaxis classically follow its pathogenesis. Calciphylaxis typically presents with livedo reticularis as vessels narrow and then progresses to purpura, bullae, necrosis, and eschar formation with the onset of acute thrombosis and ischemia. Histopathology is characterized by small- and medium-sized vessel calcification and thrombus, dermal necrosis, and septal panniculitis, though the histology can be highly variable.12 Unfortunately, the already poor prognosis for calciphylaxis worsens when lesions become either ulcerative or present on the proximal extremities and trunk.4,13 Sepsis is the leading cause of death in calciphylaxis patients, affecting more than 50% of patients.2,3,14 The differential diagnoses for calciphylactic-appearing lesions include warfarin-induced skin necrosis, disseminated intravascular coagulation, pyoderma gangrenosum, cholesterol emboli, and various vasculitides and coagulopathies.

Risk Factors
Our case demonstrates the importance of risk factor minimization, trigger avoidance, and early intervention due to the high mortality rate of calciphylaxis. Selye et al15 coined the term calciphylaxis in 1961 based on experiments that induced calciphylaxis in rat models. Their research concluded that there were certain sensitizers (ie, risk factors) that predisposed patients to medial calcium deposition in blood vessels and other challengers (ie, triggers) that acted as inciting events to calcium deposition. Our patient presented with multiple known risk factors for calciphylaxis, including obesity (body mass index, 40 kg/m2), female gender, white race, hypoalbuminemia, and chronic corticosteroid use.16 In the presence of a milieu of risk factors, the patient’s rapid weight loss and episodes of hypotension likely were triggers for calciphylaxis.



Other case reports in the literature have suggested weight loss as a trigger for NUC. One morbidly obese patient with inactive rheumatoid arthritis had onset of calciphylaxis lesions after unintentional weight loss of approximately 50% body weight in 1 year17; however, the weight loss does not have to be drastic to trigger calciphylaxis. Another study of 16 patients with uremic calciphylaxis found that 7 of 16 (44%) patients lost 10 to 50 kg in the 6 months prior to calciphylaxis onset.14 One proposed mechanism by Munavalli et al10 is that elevated levels of matrix metalloproteinases during catabolic weight loss states enhance the deposition of calcium into elastic fibers of small vessels. The authors found elevated serum levels of matrix metalloproteinases in their patients with NUC induced by rapid weight loss.10

A meta-analysis by Nigwekar et al3 found a history of prior corticosteroid use in 61% (22/36) of NUC cases reviewed. However, it is unclear whether it is the use of corticosteroids or chronic inflammation that is implicated in NUC pathogenesis. Chronic inflammation causes downregulation of anticalcification signaling pathways.18-20 The role of 2 vascular calcification inhibitors has been evaluated in the pathogenesis of calciphylaxis: fetuin-A and matrix gla protein (MGP).21 The activity of these proteins is decreased not only in calciphylaxis but also in other inflammatory states and chronic renal failure.18-20 One study found lower fetuin-A levels in 312 hemodialysis patients compared to healthy controls and an association between low fetuin-A levels and increased C-reactive protein levels.22 Reduced fetuin-A and MGP levels may be the result of several calciphylaxis risk factors. Warfarin is believed to trigger calciphylaxis via inhibition of gamma-carboxylation of MGP, which is necessary for its anticalcification activity.23 Hypoalbuminemia and alcoholic liver disease also are risk factors that may be explained by the fact that fetuin-A is synthesized in the liver.24 Therefore, liver disease results in decreased production of fetuin-A that is permissive to vascular calcification in calciphylaxis patients.

There have been other reports of calciphylaxis patients who were originally hospitalized due to hypotension, which may serve as a trigger for calciphylaxis onset.25 Because calciphylaxis lesions are more likely to occur in the fatty areas of the abdomen and proximal thighs where blood flow is slower, hypotension likely accentuates the slowing of blood flow and subsequent blood vessel calcification. This theory is supported by studies showing that established calciphylactic lesions worsen more quickly in the presence of systemic hypotension.26 One patient with ESRD and calciphylaxis of the breasts had consistent systolic blood pressure readings in the high 60s to low 70s between dialysis sessions.27 Due to this association, we recommend that patients with calciphylaxis have close blood pressure monitoring to aid in preventing disease progression.28

Management
Calciphylaxis treatment has not yet been standardized, as it is an uncommon disease whose pathogenesis is not fully understood. Current management strategies aim to normalize metabolic abnormalities such as hypercalcemia if they are present and remove inciting agents such as warfarin and corticosteroids.29 Other medical treatments that have been successfully used include sodium thiosulfate, oral steroids, and adjunctive bisphosphonates.29-31 Sodium thiosulfate is known to cause metabolic acidosis by generating thiosulfuric acid in vivo in patients with or without renal disease; therefore, patients on sodium thiosulfate therapy should be monitored for development of metabolic acidosis and treated with oral sodium bicarbonate or dialysis as needed.30,32 Wound care also is an important element of calciphylaxis treatment; however, the debridement of wounds is controversial. Some argue that dry intact eschars serve to protect against sepsis, which is the leading cause of death in calciphylaxis.2,14,33 In contrast, a retrospective study of 63 calciphylaxis patients found a 1-year survival rate of 61.6% in 17 patients receiving wound debridement vs 27.4% in 46 patients who did not.2 The current consensus is that debridement should be considered on a case-by-case basis, factoring in the presence of wound infection, size of wounds, stability of eschars, and treatment goals of the patient.34 Future studies should be aimed at this issue, with special focus on how these factors and the decision to debride or not impact patient outcomes.

Conclusion

Calciphylaxis is a potentially fatal disease that impacts both patients with ESRD and those with nonuremic risk factors. The term calcific uremic arteriolopathy should be disregarded, as nonuremic causes are being reported with increased frequency in the literature. In such cases, patients often have multiple risk factors, including obesity, primary hyperparathyroidism, alcoholic liver disease, and underlying malignancy, among others. Certain triggers for onset of calciphylaxis should be avoided in at-risk patients, including the use of corticosteroids or warfarin; iron and albumin infusions; hypotension; and rapid weight loss. Our fatal case of NUC is a reminder to dermatologists treating at-risk patients to avoid these triggers and to keep calciphylaxis in the differential diagnosis when encountering early lesions such as livedo reticularis, as progression of these lesions has a 1-year mortality rate of more than 50% with the therapies being utilized at this time.

References
  1. Au S, Crawford RI. Three-dimensional analysis of a calciphylaxis plaque: clues to pathogenesis. J Am Acad Dermatol. 2007;47:53-57.
  2. Weenig RH, Sewell LD, Davis MD, et al. Calciphylaxis: natural history, risk factor analysis, and outcome. J Am Acad Dermatol. 2007;56:569-579.
  3. Nigwekar SU, Wolf M, Sterns RH, et al. Calciphylaxis from nonuremic causes: a systematic review. Clin J Am Soc Nephrol. 2008;3:1139-1143.
  4. Fine A, Zacharias J. Calciphylaxis is usually non-ulcerating: risk factors, outcome and therapy. Kidney Int. 2002;61:2210-2217.
  5. Angelis M, Wong LL, Myers SA, et al. Calciphylaxis in patients on hemodialysis: a prevalence study. Surgery. 1997;122:1083-1090.
  6. Chavel SM, Taraszka KS, Schaffer JV, et al. Calciphylaxis associated with acute, reversible renal failure in the setting of alcoholic cirrhosis. J Am Acad Dermatol. 2004;50:125-128.
  7. Bosler DS, Amin MB, Gulli F, et al. Unusual case of calciphylaxis associated with metastatic breast carcinoma. Am J Dermatopathol. 2007;29:400-403.
  8. Buxtorf K, Cerottini JP, Panizzon RG. Lower limb skin ulcerations, intravascular calcifications and sensorimotor polyneuropathy: calciphylaxis as part of a hyperparathyroidism? Dermatology. 1999;198:423-425.
  9. Brouns K, Verbeken E, Degreef H, et al. Fatal calciphylaxis in two patients with giant cell arteritis. Clin Rheumatol. 2007;26:836-840.
  10. Munavalli G, Reisenauer A, Moses M, et al. Weight loss-induced calciphylaxis: potential role of matrix metalloproteinases. J Dermatol. 2003;30:915-919.
  11. Bae GH, Nambudiri VE, Bach DQ, et al. Rapidly progressive nonuremic calciphylaxis in setting of warfarin. Am J Med. 2015;128:E19-E21.
  12. Essary LR, Wick MR. Cutaneous calciphylaxis: an underrecognized clinicopathologic entity. Am J Clin Pathol. 2000;113:280-287.
  13. Hafner J, Keusch G, Wahl C, et al. Uremic small-artery disease with medial calcification and intimal hyperplasia (so-called calciphylaxis): a complication of chronic renal failure and benefit from parathyroidectomy. J Am Acad Dermatol. 1995;33:954-962.
  14. Coates T, Kirkland GS, Dymock RB, et al. Cutaneous necrosis from calcific uremic arteriolopathy. Am J Kidney Dis. 1998;32:384-391.
  15. Selye H, Gentile G, Prioreschi P. Cutaneous molt induced by calciphylaxis in the rat. Science. 1961;134:1876-1877.
  16. Kalajian AH, Malhotra PS, Callen JP, et al. Calciphylaxis with normal renal and parathyroid function: not as rare as previously believed. Arch Dermatol. 2009;145:451-458.
  17. Malabu U, Roberts L, Sangla K. Calciphylaxis in a morbidly obese woman with rheumatoid arthritis presenting with severe weight loss and vitamin D deficiency. Endocr Pract. 2011;17:104-108.
  18. Schäfer C, Heiss A, Schwarz A, et al. The serum protein alpha 2–Heremans-Schmid glycoprotein/fetuin-A is a systemically acting inhibitor of ectopic calcification. J Clin Invest. 2003;112:357-366.
  19. Cozzolino M, Galassi A, Biondi ML, et al. Serum fetuin-A levels link inflammation and cardiovascular calcification in hemodialysis patients. Am J Nephrol. 2006;26:423-429.
  20. Luo G, Ducy P, McKee MD, et al. Spontaneous calcification of arteries and cartilage in mice lacking matrix GLA protein. Nature. 1997;386:78-81.
  21. Weenig RH. Pathogenesis of calciphylaxis: Hans Selye to nuclear factor kappa-B. J Am Acad Dermatol. 2008;58:458-471.
  22. Ketteler M, Bongartz P, Westenfeld R, et al. Association of low fetuin-A (AHSG) concentrations in serum with cardiovascular mortality in patients on dialysis: a cross-sectional study. Lancet. 2003;361:827-833.
  23. Wallin R, Cain D, Sane DC. Matrix Gla protein synthesis and gamma-carboxylation in the aortic vessel wall and proliferating vascular smooth muscle cells: a cell system which resembles the system in bone cells. Thromb Haemost. 1999;82:1764-1767.
  24. Sowers KM, Hayden MR. Calcific uremic arteriolopathy: pathophysiology, reactive oxygen species and therapeutic approaches. Oxid Med Cell Longev. 2010;3:109-121.
  25. Allegretti AS, Nazarian RM, Goverman J, et al. Calciphylaxis: a rare but fatal delayed complication of Roux-en-Y gastric bypass surgery. Am J Kidney Dis. 2014;64:274-277.
  26. Wilmer WA, Magro CM. Calciphylaxis: emerging concepts in prevention, diagnosis, and treatment. Semin Dial. 2002;15:172-186.
  27. Gupta D, Tadros R, Mazumdar A, et al. Breast lesions with intractable pain in end-stage renal disease: calciphylaxis with chronic hypotensive dermatopathy related watershed breast lesions. J Palliat Med. 2013;16:551-554.
  28. Janigan DT, Hirsch DJ, Klassen GA, et al. Calcified subcutaneous arterioles with infarcts of the subcutis and skin (“calciphylaxis”) in chronic renal failure. Am J Kidney Dis. 2000;35:588-597.
  29. Jeong HS, Dominguez AR. Calciphylaxis: controversies in pathogenesis, diagnosis and treatment. Am J Med Sci. 2016;351:217-227.
  30. Bourgeois P, De Haes P. Sodium thiosulfate as a treatment for calciphylaxis: a case series. J Dermatolog Treat. 2016;27:520-524.
  31. Biswas A, Walsh NM, Tremaine R. A case of nonuremic calciphylaxis treated effectively with systemic corticosteroids. J Cutan Med Surg. 2016;20:275-278.
  32. Selk N, Rodby RA. Unexpectedly severe metabolic acidosis associated with sodium thiosulfate therapy in a patient with calcific uremic arteriolopathy. Semin Dial. 2011;24:85-88.
  33. Martin R. Mysterious calciphylaxis: wounds with eschar—to debride or not to debride? Ostomy Wound Manage. 2004;50:64-66, 68-70.
  34. Nigwekar SU, Kroshinsky D, Nazarian RM, et al. Calciphylaxis: risk factors, diagnosis, and treatment. Am J Kidney Dis. 2015;66:133-146.
Issue
Cutis - 105(1)
Page Number
E11-E14
Display Headline
Nonuremic Calciphylaxis Triggered by Rapid Weight Loss and Hypotension

Practice Points

  • Calciphylaxis is a potentially fatal disease caused by metastatic calcification of cutaneous small- and medium-sized blood vessels leading to ischemia and necrosis.
  • Calciphylaxis most commonly is seen in patients with renal disease requiring dialysis, but it also may be triggered by nonuremic causes in patients with known risk factors for calciphylaxis.
  • Risk factors for calciphylaxis include female gender, white race, obesity, alcoholic liver disease, primary hyperparathyroidism, connective tissue disease, underlying malignancy, protein C or S deficiency, corticosteroid use, warfarin use, diabetes, iron or albumin infusions, and rapid weight loss.
  • The term calcific uremic arteriolopathy should be disregarded, as nonuremic causes are being reported with increased frequency in the literature.

Wuhan virus: What clinicians need to know


As the Wuhan coronavirus story unfolds, the most important thing for clinicians in the United States to do is ask patients who appear to have the flu if they, or someone they have been in contact with, recently returned from China, according to infectious disease experts.

China News Service/CC BY 3.0
Medical staff in Wuhan railway station during the Wuhan coronavirus outbreak, Jan. 24, 2020.

“We are asking that of everyone with fever and respiratory symptoms who comes to our clinics, hospital, or emergency room. It’s a powerful screening tool,” said William Schaffner, MD, professor of preventive medicine and infectious diseases at Vanderbilt University Medical Center, Nashville, Tenn.

In addition to fever, common signs of infection include cough, shortness of breath, and breathing difficulties. Some patients have had diarrhea, vomiting, and other gastrointestinal symptoms. In more severe cases, infection can cause pneumonia, severe acute respiratory syndrome, kidney failure, and death. The incubation period appears to be up to 2 weeks, according to the World Health Organization (WHO).

If patients exhibit symptoms and either they or a close contact has returned from China recently, take standard airborne precautions and send specimens – a serum sample, oral and nasal pharyngeal swabs, and lower respiratory tract specimens if available – to the local health department, which will forward them to the Centers for Disease Control and Prevention (CDC) for testing. Turnaround time is 24-48 hours.
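The decision logic described above — symptoms plus a travel-exposure link trigger precautions, specimen collection, and health-department notification — can be sketched as follows. This is an illustrative sketch only, not clinical guidance; all names are hypothetical.

```python
# Illustrative sketch of the screening steps described above; not clinical
# guidance. All field and function names here are hypothetical.
from dataclasses import dataclass

@dataclass
class Screening:
    fever: bool
    respiratory_symptoms: bool
    recent_travel_to_china: bool   # patient returned within the ~2-week window
    close_contact_traveled: bool   # or a close contact recently returned

def actions_for(s: Screening) -> list[str]:
    """Return the response steps the interim guidance calls for."""
    symptomatic = s.fever and s.respiratory_symptoms
    exposed = s.recent_travel_to_china or s.close_contact_traveled
    if not (symptomatic and exposed):
        return []  # routine care; no 2019-nCoV workup indicated
    return [
        "apply standard airborne precautions",
        "collect serum, oral and nasal pharyngeal swabs, and lower "
        "respiratory specimens if available",
        "send specimens to the local health department, which forwards "
        "them to the CDC (24-48 hour turnaround)",
    ]

print(len(actions_for(Screening(True, True, False, True))))  # 3
```

The point of the exposure check is the one the experts emphasize: travel history, whether the patient's own or a close contact's, is the screening pivot; symptoms alone do not trigger the workup.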

Dr. William Schaffner


The 2019 Novel Coronavirus (2019-nCoV), identified as the cause of an outbreak of respiratory illness first detected in December in association with a live animal market in Wuhan, China, has been implicated in almost 2,000 cases and 56 deaths in that country. Cases have been reported in 13 countries besides China. Five cases of 2019-nCoV infection have been confirmed in the United States, all in people recently returned from Wuhan. As the virus spreads in China, however, it’s almost certain more cases will show up in the United States. Travel history is key, Dr. Schaffner and others said.
 

Plan and rehearse

The first step to prepare is to use the CDC’s Interim Guidance for Healthcare Professionals to make a written plan specific to your practice to respond to a potential case. The plan must include notifying the local health department, the CDC liaison for testing, and tracking down patient contacts.

“It’s not good enough to just download CDC’s guidance; use it to make your own local plan and know what to do 24/7,” said Daniel Lucey, MD, an infectious disease expert at Georgetown University Medical Center, Washington, D.C.

“Know who is on call at the health department on weekends and nights,” he said. Know where the patient will be isolated, and figure out what to do if there is more than one and tests come back positive. Have masks on hand, and rehearse the response. “Make a coronavirus team, and absolutely have the nurses involved,” as well as other providers who may come into contact with a case, he added.

Dr. Daniel Lucey


“You want to be able to do as well as your counterparts in Washington state and Chicago,” where the first two U.S. cases emerged. “They were prepared. They knew what to do,” Dr. Lucey said.

Those first two U.S. patients – a man in Everett, Wash., and a Chicago woman – developed symptoms after returning from Wuhan, a city of 11 million just over 400 miles inland from the port city of Shanghai. On Jan. 26 three more cases were confirmed by the CDC, two in California and one in Arizona, and each had recently traveled to Wuhan.  All five patients remain hospitalized, and there’s no evidence they spread the infection further. There is also no evidence of human-to-human transmission of other cases exported from China to any other countries, according to the WHO.

WHO declined to declare a global health emergency – a Public Health Emergency of International Concern, in its parlance – on Jan. 23. That step would have triggered travel and trade restrictions in member states, including the United States. For now, the group said, the measure isn’t warranted.

Fatality rates

The focus right now is China. The outbreak has spread beyond Wuhan to other parts of the country, and there’s evidence of fourth-generation spread.



Transportation into and out of Wuhan and other cities has been curtailed, Lunar New Year festivals have been canceled, and the Shanghai Disneyland has been closed, among other measures taken by Chinese officials.

The government could be taking drastic measures in part to avoid a repeat of the public criticism it drew in the early 2000s for its delayed response and lack of transparency during the global outbreak of another wildlife market coronavirus epidemic, severe acute respiratory syndrome (SARS). In a press conference Jan. 22, WHO officials commended the government’s containment efforts but stopped short of endorsing them.

According to WHO, serious cases in China have mostly been in people over 40 years old with significant comorbidities and have skewed towards men. Spread seems to be limited to family members, health care providers, and other close contacts, probably by respiratory droplets. If that pattern holds, WHO officials said, the outbreak is containable.

The fatality rate appears to be around 3%, a good deal lower than the 10% reported for SARS and much lower than the nearly 40% reported for Middle East respiratory syndrome (MERS), another coronavirus that recently emerged from the animal trade.

The Wuhan virus fatality rate might drop as milder cases are detected and added to the denominator. “It definitely appears to be less severe than SARS and MERS,” said Amesh Adalja, MD, an infectious disease physician in Pittsburgh and emerging infectious disease researcher at Johns Hopkins University, Baltimore.
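The denominator effect Dr. Adalja alludes to can be sketched with a quick back-of-the-envelope calculation. The numbers below are illustrative only, not surveillance figures: the crude case-fatality rate is simply deaths divided by known cases, so detecting milder cases enlarges the denominator and lowers the apparent rate even if the death count is unchanged.

```python
# Illustrative sketch of the case-fatality-rate denominator effect.
# All case counts below are hypothetical examples, not real data.

def case_fatality_rate(deaths: int, known_cases: int) -> float:
    """Crude case-fatality rate, as a percentage of known cases."""
    return 100.0 * deaths / known_cases

deaths = 56
detected_cases = 2_000          # early detection skews toward severe cases
early_cfr = case_fatality_rate(deaths, detected_cases)
print(f"Early estimate: {early_cfr:.1f}%")        # 2.8%

milder_cases_found_later = 4_000   # hypothetical: mild cases found by wider testing
revised_cfr = case_fatality_rate(deaths, detected_cases + milder_cases_found_later)
print(f"Revised estimate: {revised_cfr:.1f}%")    # 0.9%
```

The same death count against a larger case count yields a markedly lower rate, which is why early fatality-rate estimates in an outbreak tend to run high.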

SARS: Lessons learned

In general, the world is much better equipped for coronavirus outbreaks than when SARS, in particular, emerged in 2003.

Dr. Amesh Adalja

WHO officials in their press conference lauded China for its openness with the current outbreak, and for isolating and sequencing the virus immediately, which gave the world a diagnostic test in the first days of the outbreak, something that wasn’t available for SARS. China and other countries also are cooperating and working closely to contain the Wuhan virus.

“What we know today might change tomorrow, so we have to keep tuned in to new information, but we learned a lot from SARS,” Dr. Schaffner said. Overall, it’s likely “the impact on the United States of this new coronavirus is going to be trivial,” he predicted.

Dr. Lucey, however, recalled that the SARS outbreak in Toronto in 2003 started with one missed case. A woman who returned from Hong Kong while asymptomatic spread the infection to her family members before she died. Her cause of death wasn’t immediately recognized, nor was the reason her family members were sick, since they hadn’t been to Hong Kong recently.

The infection ultimately spread to more than 200 people, about half of them health care workers. A few people died.

If a virus is sufficiently contagious, “it just takes one. You don’t want to be the one who misses that first patient,” Dr. Lucey said.

Currently, there are no antivirals or vaccines for coronaviruses; researchers are working on both, but for now, care is supportive.

[email protected]

This article was updated with new case numbers on 1/26/20.
