SGLT2 inhibitors for type 1 diabetes: Doctors debate the merits
SAN FRANCISCO – At first, the diabetes professionals in the audience at the annual scientific sessions of the American Diabetes Association overwhelmingly raised their hands to say they would support using SGLT2 inhibitors as adjunctive therapy in patients with type 1 diabetes. Then two physicians debated whether the drugs were too risky – predictably, one said yes, the other said no. In the end, most of the audience was unconvinced by one of the doctors. Which one? Well, we’ll get to that.
First, let’s look at the issue that divided the two physicians: Should the sodium-glucose cotransporter 2 (SGLT2) inhibitors canagliflozin (Invokana), dapagliflozin (Farxiga), and empagliflozin (Jardiance) – now commonly used to treat patients with type 2 diabetes – also be prescribed for patients with type 1 diabetes?
The drugs are not cleared in the United States for use in patients with type 1 diabetes, although drug makers are seeking approval. Earlier in 2019, the Food and Drug Administration turned down a request for the approval of sotagliflozin (Zynquista), a dual SGLT1 and SGLT2 inhibitor, for adults with type 1 diabetes. However, the drug has been approved in the European Union for certain overweight patients with type 1 diabetes.
In addition, the drugs are very costly, compared with some of the other diabetes medications, and physicians say that puts them out of reach for some patients.
The case for ...
In arguing that SGLT2 inhibitors would be appropriate as a therapy for patients with type 1 diabetes, Bruce A. Perkins, MD, MPH, professor and clinician-scientist at Leadership Sinai Center for Diabetes at the University of Toronto, emphasized the need for new treatments in type 1 diabetes.
“Even today, people with type 1 tell us they feel isolated, they fear hypoglycemia, they fear complications. And they have this undue burden of self-management,” he said. “We can do much better. Insulin therapy still leaves us desperately needing more.”
Dr. Perkins highlighted the drugs’ widely lauded effects on cardiac and renal health and noted that a 2019 meta-analysis of 10 trials found that, compared with placebo, the drugs were associated with mean reductions in hemoglobin A1c (–0.39%; 95% confidence interval, –0.43 to –0.36) and body weight (–3.47%; 95% CI, –3.78 to –3.16).
That analysis also showed a higher risk of genital infection (3.57; 95% CI, 2.97-4.29) and diabetic ketoacidosis (DKA; 3.11; 95% CI, 2.11-4.58) with SGLT inhibitors, but the authors concluded that, despite the adverse events, the available data suggested that adding the inhibitors to basal insulin could be beneficial in patients with type 1 diabetes (Diabetes Metab Res Rev. 2019 Apr 11. doi: 10.1002/dmrr.3169).
In reference to the findings on DKA, Dr. Perkins said recent research has suggested that the DKA risk could be lowered by decreasing the dose of the SGLT2 inhibitors. “[DKA] is a problem, there’s no question, but there’s a background population risk. Whether we introduce an SGLT2 or not, we have to deal with this issue. We can deal with and overcome the excess DKA risk.”
In the big picture, he said, “it would be a crime not to make this treatment available to some patients. Meaningful benefits far outweigh the risks.”
The case against ...
On the other side of the debate was David M. Nathan, MD, of Harvard Medical School and the Clinical Research Center and Diabetes Center at Massachusetts General Hospital, Boston, who acknowledged the benefits of the SGLT2 inhibitors in type 2 diabetes.
However, he pointed to findings from a 2015 trial of canagliflozin as an add-on in type 1 diabetes (Diabetes Care. 2015;38[12]:2258-65). In that 18-week, randomized phase 2 trial, the investigators found that patients who took the drug had significantly higher rates of serious adverse events (7.7% or 6.8%, depending on dose, vs. 0% for placebo), urinary tract infections (4.3% and 5.1% vs. 1.7%), and DKA (4.3% and 6.0% vs. 0%).
“It would have cost $400 a month for the ‘pleasure’ of those side effects,” Dr. Nathan said.
He also noted a 2015 report on a 29-day, randomized, placebo-controlled study of sotagliflozin, the dual SGLT1 and SGLT2 inhibitor, as an add-on treatment for type 1 diabetes, in which investigators reported two episodes of DKA (13%) in the sotagliflozin group, compared with none in the placebo group (Diabetes Care. 2015;38[7]:1181-8).
Dr. Nathan also pointed to a recent FDA warning about cases of Fournier gangrene, a rare type of serious genital infection, in patients taking SGLT2 inhibitors.
“To me, the risk [of using SGLT2 inhibitors in type 1 diabetes] outweighs the benefit by a lot,” he said, echoing comments from an editorial he wrote in 2017: “any added benefits of adjunctive therapies for type 1 diabetes must be carefully balanced against their added risk and cost. Physicians and patients should beware” (N Engl J Med. 2017;377:2390-1).
The outcome ...
The audience was not sufficiently convinced by Dr. Nathan to swing the final vote fully in his favor, but he did manage to dent the initial support for using SGLT2 inhibitors in patients with type 1 diabetes. Before the debate, the show of hands suggested that roughly 80% of the audience thought SGLT2 inhibitors would be an appropriate therapy option for patients with type 1 diabetes. When the moderator asked the same question again after the arguments had been presented, that initial support had eroded to about 70%. Dr. Nathan had clearly raised some doubts among the attendees, but Dr. Perkins’ perspective prevailed.
Dr. Perkins reported speaker fees from Medtronic, Abbott, Sanofi and Lilly; advisory panel service for Abbott, Boehringer Ingelheim, and Insulet; and research support to his institution from Boehringer Ingelheim and Bank of Montreal. Dr. Nathan reports no disclosures.
EXPERT ANALYSIS FROM ADA 2019
Some burnout factors are within a physician’s control
SAN DIEGO – Eat a healthy lunch. Get more sleep. Move your body. How many times in the course of a week do you give patients gentle reminders to practice these most basic steps of self-care? And how many times in the course of a week do you allow these basics to go by the wayside for yourself?
Self-care is one of the elements that can defend against physician burnout, Carol Burke, MD, said at a session on physician burnout held during the annual Digestive Disease Week®. Personal self-care can make a real difference, and shouldn’t be ignored as the profession works to reel back some of the institutional changes that challenge physicians today.
In the workplace, unhealthy stress levels can contribute to burnout, disruptive behavior, decreased productivity, and disengagement. Burnout – a response to chronic stress characterized by a diminished sense of personal accomplishment and emotional exhaustion – can result in cynicism, a lack of compassion, and feelings of depersonalization, said Dr. Burke.
Contributors to physician stress have been well documented, said Dr. Burke, a professor of gastroenterology at the Cleveland Clinic. These range from personal debt and the struggle for work-life balance to an increased focus on metrics and documentation at the expense of authentic patient engagement. All of these factors are measurable by means of the validated Maslach Burnout Inventory, said Dr. Burke. A recent survey that used this measure indicated that nearly half of physician respondents report experiencing burnout.
In 2017, Dr. Burke led a survey of American College of Gastroenterology members that showed 49.3% of respondents reported feeling emotional exhaustion and/or depersonalization. Some key themes emerged from the survey, she said. Women and younger physicians were more likely to experience burnout. Having children in the middle years (11-15 years old) and spending more time on domestic duties and child care increased the risk of burnout.
And doing patient-related work at home or having a spouse or partner bring work home also upped burnout risk. Skipping breakfast and lunch during the workweek was another risk factor, which highlights the importance of basic self-care as armor against the administrative onslaught, said Dr. Burke.
Measured by volume alone, physician work can be overwhelming: 45% of physicians in the United States work more than 60 hours weekly, compared with fewer than 10% of the general workforce, said Dr. Burke.
What factors within the control of an individual practitioner can reduce the risk of debilitating burnout and improve quality of life? Physicians who do report a high quality of life, said Dr. Burke, are more likely to have a positive outlook. They also practice basic self-care like taking vacations, exercising regularly, and engaging in hobbies outside of work.
For exhausted, overworked clinicians, getting a good night’s sleep is a critical form of self-care. But erratic schedules, stress, and family demands can all sabotage plans for better sleep hygiene. Still, attending to sleep is important, said Dr. Burke. Individuals with disturbed sleep are less mindful and have less self-compassion. Sleep disturbance is also strongly correlated with perceived stress.
She also reported that the odds ratio for burnout was 14.7 for physicians who reported insomnia when compared with those without sleep disturbance, and it was 9.9 for those who reported nonrestorative sleep.
Physical activity can help sleep and also help other markers of burnout. Dr. Burke pointed to a recent study of 4,402 medical students. Participants were able to reduce burnout risk when they met the Centers for Disease Control and Prevention recommendations of achieving at least 150 minutes/week of moderate exercise or 75 minutes/week of vigorous exercise, plus at least 2 days/week of strength training (P less than .001; Acad Med. 2017;92:1006).
Physician-targeted programs can work, she said: “Facilitated interventions improve well-being, attitudes associated with patient-centered care, meaning and engagement in work, and reduce burnout.”
Practice-focused interventions to reclaim a semblance of control over one’s time are varied, and some are easier to implement than others. Virtual visits and group visits are surprisingly well received by patients, and each can be huge time-savers for physicians, said Dr. Burke. There are billing and workflow pitfalls to avoid, but group visits, in particular, can be practice changing for those who have heavy backlogs and see many patients with the same condition.
Medical scribes can improve productivity and reduce physician time spent on documentation. Also, said Dr. Burke, visits can appropriately be billed at a higher level of complexity when contemporaneous documentation is thorough. Clinicians overall feel that they can engage more fully with patients, and also feel more effective, when well-trained scribes are integrated into a practice, she said.
Female physicians have repeatedly been shown to have patient panels that are more demanding, and male and female patients alike expect more empathy and social support from their physicians, said Dr. Burke. When psychosocial complexities are interwoven with patient care, as they are more frequently for female providers, a 15-minute visit can easily run twice that – or more. Dr. Burke is among the physicians advocating for recognition of this invisible burden on female clinicians, either through adaptive scheduling or differential productivity expectations. This approach is not without controversy, she acknowledged; still, practices should acknowledge that clinic flow can be very different for male and female gastroenterologists, she said.
Dr. Burke reported no relevant conflicts of interest.
Read tips for how to balance family and personal lives with a professional career in order to avoid burnout at https://www.ddwnews.org/news/aga-symposium-provides-practical-tips-to-avoid-physician-burnout/.
EXPERT ANALYSIS FROM DDW 2019
Improved Patient Outcomes and Reduced Wait Times: Transforming a VA Outpatient Substance Use Disorder Program
Substance use disorders (SUDs) are an increasing public health concern in the US. The 2015 National Survey on Drug Use and Health indicated that 27 million people (8% of the US population) reported current use of recreational drugs or misuse of alcohol or prescription medications.1 The 2013 National Survey on Drug Use and Health indicated that 1.5 million veterans (roughly 6.6%) met the criteria for a SUD.2 More than 50% of patients awaiting entry into a SUD treatment program will never achieve admission due, in part, to long wait times.3-5
National attention has been focused on increasing veteran access to quality treatment based on evidence-based practices (EBPs). Several national legislative measures and treatment protocols have been implemented: the Uniform Mental Health Services in US Department of Veterans Affairs (VA) medical centers and clinics; the Veterans Access, Choice, and Accountability Act (2014); the Cognitive Behavioral Therapy for Substance Use Disorders (CBT-SUD) Training Program; and the Psychotropic Drug Safety Initiative (PDSI).6-8 Consistent with these directives and in line with American Society of Addiction Medicine (ASAM) and Substance Abuse and Mental Health Services Administration (SAMHSA) guidelines for medication-assisted therapies (MAT), the James A. Haley Veterans’ Hospital (JAHVH) Mental Health and Behavioral Sciences Service (MH&BSS) Substance Use Disorders Service (SUDS) in Tampa, Florida, implemented an evidence-based, treatment-on-demand model of care.9-11
Meeting SUD Treatment Needs
What does the new supervisor of a clinical program do when a 24-employee outpatient VA Alcohol and Drug Addiction Treatment Program (ADATP) has an average 33-day wait time for treatment with 54% of patients lost to care between initial evaluation and admission?12 Patients lacked consistent access to SUD pharmacotherapy. The national VA clinical performance indicators were substandard, and no additional resources were available to apply to the program.
At JAHVH the program supervisor enlisted hospital leadership to support program redesign. The redesign sought to improve operational efficiency and eliminate patient wait time; adopt national standards for assessment and treatment developed by ASAM; implement strictly evidence-based psychotherapeutic treatments; educate program psychiatrists about evidence-based psychopharmacologic treatments and hold them accountable for patient adherence; streamline documentation templates; free clinical providers from nonclinical tasks; create an inpatient addiction consult team to diagnose and treat chronic hospitalized patients with SUDs; ensure continuity of care; and, standardize consistent, objective measures of patient response to treatment to track the program’s effectiveness.
In this article, the authors provide an explanation of the clinical, theoretical foundation and the practical steps taken to design and implement this transformation. They then describe the lessons learned, hoping that their process will serve as a model for those in similar situations.
Program Redesign
On July 1, 2015, a new program supervisor was hired and began a 2-month evaluation and analysis of the program with input from leadership, staff, and hospital/community stakeholders. On September 1, the monthlong process of developing the redesign began. On September 30, the plan was presented to, and approved by, MH&BSS leadership. October was spent preparing for the change, with an implementation date of November 2. On November 2, 2015, the complete redesign was implemented.
Needs Assessment
A needs assessment yielded improvement opportunities in program structure (levels of care), clinical content, and staff and resource allocation, including clinical workflow and management systems. Staff identified philosophical and practical variance in the program, often leading to confusion for patients and clinicians and potentially resulting in disparate quality of care and patient outcomes. Recommendations for addressing these needs included incorporating ASAM guidelines for assignment to clinically appropriate levels of care, implementing consistent EBPs for SUD and comorbid conditions,9 and emphasizing staff training and development to champion an evidence-based program philosophy and service delivery.
The assessment determined that the average waitlist time was 33 days, and patients were required to abstain from substance or alcohol use prior to admission to the Intensive Outpatient Program. If a waitlisted patient relapsed, she or he was removed from the waitlist and denied admission. A study conducted at JAHVH reported that 54% of waitlisted patients in this clinic (prior to November 2, 2015) never were admitted to the program.12 Access to care was considered a significant issue.
Program Implementation
September was spent developing a comprehensive redesign of the SUD clinic. The vision included incorporating all ASAM levels of care; creating an evidence-based, treatment-on-demand model of care; and securing the support of the MH&BSS leadership team, staff, and patients for the redesign. The supervisory clinician interviewed staff both individually and as a group. Clinicians were provided extensive training on EBP for SUDs, including psychotherapies, psychosocial treatments, and psychopharmacologic interventions. A journal club was started with staff-generated topics, offering articles on current research, EBPs, and psychotherapeutic techniques; continuing education on substances; and management of coexisting diagnoses. Clinicians increased the frequency of SUD in-service trainings. Psychiatrists provided several Grand Rounds presentations to the MH&BSS service. All counselors were assigned to 1 of the program’s 3 clinical psychologists for individual weekly clinical supervision.
By providing all staff with current, evidence-based, clinically relevant treatment information and emphasizing its relationship to successful patient outcomes, program leadership energized staff support. Staff were encouraged to perform at the top of their scope of practice and engage in training and consultation. Each staff member was delegated a role in the process to inspire buy-in.
Preparation for the Shift
October was spent preparing for a seamless, one-day implementation of the proposed changes, including updated clinical policies, procedures, and documentation templates (rewritten to include only clinically appropriate information required by VA policy or the Joint Commission); streamlined staff schedules; and a staff-developed, research- and policy-driven EBP handbook. Finally, the Brief Addiction Monitor (BAM) was selected as the objective measure for consistently assessing patient progress in treatment, and staff were instructed to use it at regular intervals and at all levels of care.
Emphasis was placed on ongoing fortification of staff and patient support for the reorganization. For example, the Addiction Severity Index, though not required by policy, was historically used, adding 90 minutes to the evaluation and admission session. Staff agreed to remove this measure to improve clinician availability. Staff were also empowered to rename the redesigned program, and chose Substance Use Disorders Service (SUDS).
Process Changes
To achieve same-day access to clinical care, program leadership created a daily morning orientation group. Patients are scheduled or may attend as a walk-in. The orientation’s purpose is to explain what services are available and to offer each patient an opportunity for immediate evaluation and treatment. Staff schedules were modified to provide patient evaluation appointment slots immediately following orientation. The number of immediate evaluation slots was initially set by analyzing the demand for treatment over the previous 6 months, determining the daily mean, and setting the number of slots to accommodate 3 standard deviations above the daily mean. If a patient in a daily orientation group expresses a willingness to engage in treatment, he or she is immediately evaluated by a counselor during a 90-minute session and seen by a psychiatrist to determine whether pharmacologic treatment would be appropriate. If needed, the medication is prescribed that day. The primary purpose of the patient’s initial clinical evaluation is to determine the most appropriate level of care based on ASAM criteria. Ninety-minute afternoon evaluation appointments with psychiatrists are also available for patients who walk into the clinic after the morning orientation group has ended.
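As a rough illustration of the slot-sizing rule described above, the following Python sketch computes the mean and standard deviation of daily demand over a look-back period and rounds up to 3 standard deviations above the mean. The function name, sample data, and the choice to round up are illustrative assumptions, not part of the program's actual tooling.

```python
# Illustrative sketch only: sizing same-day evaluation slots at the mean
# daily demand plus 3 standard deviations, per the rule described above.
import math
import statistics

def slots_needed(daily_demand_counts, num_sd=3):
    """Return the number of same-day evaluation slots to reserve.

    daily_demand_counts: daily counts of patients requesting evaluation
    over the look-back period (e.g., the previous 6 months).
    """
    mean = statistics.mean(daily_demand_counts)
    sd = statistics.stdev(daily_demand_counts)
    return math.ceil(mean + num_sd * sd)

# Hypothetical example: demand averaging about 4 patients/day would
# yield roughly ceil(mean + 3 * SD) reserved slots per day.
print(slots_needed([3, 5, 4, 6, 2, 4, 5, 3, 4, 6]))
```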
Prior to the redesign, clinic psychiatrists were minimally prescribing evidence-based pharmacotherapy for sobriety support. At the time of redesign, only 8% of patients diagnosed with opioid use disorders (OUDs) were prescribed buprenorphine/naloxone or naltrexone. Just 1.9% of patients diagnosed with alcohol use disorder (AUD) were prescribed naltrexone or acamprosate. With the redesign, access to these medications has significantly expanded.
All templates were redesigned to ensure consistent documentation. This change decreased the overall provider task burden, and explicitly supported the use of ASAM multidimensional criteria and the Brief Addiction Monitor (BAM) to identify a pretreatment baseline score and track each patient’s clinical progress.13 Evidence-based written curricula were standardized for individual and group psychotherapies to reduce provider and programmatic variation.
The redesign creates distinct levels of care based on ASAM criteria, including harm reduction, ambulatory detoxification, outpatient group and individual psychotherapy, an evidence-based Intensive Outpatient Program (IOP), and aftercare. Application of the ASAM standards has allowed clinicians to make accurate placement decisions that best meet individual patient needs and to serve as effective stewards even with limited treatment and financial resources. Although JAHVH does not have a residential SUD program, procedures were developed to refer veterans to community-based residential treatment programs when appropriate.
Group Therapies
With the redesign, SUDS was no longer exclusively a 12-step program; however, it still supported and recognized the value of this approach for some patients. A psychologist periodically audits group sessions to prevent drift from that group’s curriculum. Counselors are assigned to weekly hour-long clinical supervision sessions with a psychologist to review patient care and reinforce the application of evidence-based individualized treatment.
After a review of the empirical literature and VA directives, CBT-SUD was adopted. It encompasses individual and group interventions, such as motivational interviewing (MI), contingency management (CM), and medication-assisted therapies, as primary therapeutic treatment modalities, all of which have demonstrated efficacy as measured by length of sobriety postintervention.9,14,15
Clinical Staff Improvements
Staff were reorganized into 3 interdisciplinary treatment teams. A weekly team meeting is scheduled to coordinate care and discuss the treatment of complex patients. Clinical staff focus has shifted from case management to diagnosis and treatment; now patients are referred to their primary care team’s social worker for case management services. Allowing clinical staff to focus solely on the diagnosis and clinical treatment of SUDs has significantly enhanced productivity and morale.
Staff receive training in the newly adopted interventions during brief monthly refresher courses provided by in-house psychologists. Additional training includes participation in local and national SUD teleconferences and onsite meetings with experts in harm reduction and motivational interventions. During the transition, clinicians were encouraged to attend staff resiliency training. Continuing education was available to the SUDS psychiatrists and all inpatient and outpatient psychiatrists at JAHVH. Recently, this educational initiative was expanded to include all primary care and inpatient internal medicine physicians.
Implementation
On November 2, 2015, all planned programmatic changes were simultaneously implemented. On that day, clinician and patient schedules changed, the new EBP curriculum was administered, the use of streamlined documentation procedures began, and daily orientation groups followed by same-day evaluations were initiated.
The pretreatment sobriety requirement was eliminated as a barrier to care, and the program began to use a harm-reduction treatment track as recommended by ASAM guidelines. Patients with urgent or emergent medical or psychiatric problems were immediately assessed by SUDS health care providers and treated in the clinic or transported to the emergency department. Previously unavailable, patient access to ambulatory detoxification was initiated. The prescription of buprenorphine/naloxone for the treatment of OUD treatments increased from 1 prescriber to all 3.
Three months after program reorganization, the leadership reviewed overall workflow, conducted patient satisfaction surveys, and evaluated facility use and productivity. To address patient needs and facilitate optimal use of space, the number of same-day evaluation slots was reduced while the number of individual therapy slots was increased.
Staff meet in workgroups to discuss EBPs and further refine content with feedback from the supervisory clinician and team psychologists who routinely audit group therapy sessions. Staff report ongoing benefit from weekly supervision with a clinical psychologist. An inpatient addiction consultation team that uses existing manpower and resources has been developed.
Program Goals and Outcomes
The SUDS program serves more patients at multiple levels of standardized care with 2 fewer full-time positions. One counselor and one advanced practice registered nurse were reallocated to different programs within the JAHVH mental health clinic. Following a review of all program clinic profiles in the VA’s Computerized Patient Record System (CPRS) for utilization, accuracy, and necessity, and allowing for accurate program data capture, the transition reduced the number of distinct clinics from 114 to 67, a 41.2% reduction. In fiscal year 2018, review of CPRS yielded 19,786 total visits from 3,645 unique patients.
Eliminate Patient Wait Time
Patient wait time, as measured in CPRS from date of initial evaluation to date of treatment, was reduced from an average of 33 days to 0 within 2 weeks of program implementation. A review of CPRS data also indicated that preadmission attrition dropped from 54% to < 1%; all patients desiring treatment are assigned a counselor, and treatment is initiated the same day.
Adopt ASAM Criteria
After the redesign, patients have received more appropriate care based on individualized treatment plans. Due to the implementation of a fluid and supportive model, patients can move through levels of care as clinical need dictates rather than failing treatment and having to reengage. Staff receive ongoing education on the use of ASAM. Evaluation and treatment plan templates now reflect assignment to level of care rationale using ASAM guidelines.
Use of Evidence-Based Psychotherapeutic Treatments
More consistent, coordinated, and effective psychotherapies have improved patient care. The program’s previous issue of patients receiving conflicting treatment guidance from different providers has been resolved. Duplicate and ineffective treatments, including multiple readmissions to the IOP level of care, overemphasis on abstinence-based modalities for patients in active use, and referrals to inpatient SUD care under the assumption that a “higher level of care is better,” have ceased through staff education, leadership support, and appropriate staffing and communication. A review of patient advocate complaints tracked and resolved by the service demonstrated an 80% decrease in complaints regarding SUD clinic services.
Implement Evidence-Based Psychopharmacologic Treatments
The pharmacotherapy education initiative yielded tangible benefits and is likely a significant contributor to the program’s improved clinical outcomes. Prescription of pharmacotherapy for OUD has risen from 8% to 25.1% of eligible patients, and appropriate medication prescription for the treatment of AUD has risen from 1.9% to 9.8% of eligible patients. These data are reflected in the VA Psychotropic Drug Safety Initiative (PDSI) dashboard.
Streamline Documentation
Reducing the charting burden was likely a significant contributor to increased provider productivity and improved patient outcomes. Regular meetings between SUDS leadership and clinical informatics ensure that standardized note templates meet hospital policy and gather all necessary accreditation information.
Improve Employee Morale
Increased staff morale is indicated by a noticeable reduction in employee sick days: per the VA electronic timekeeping system, sick-day use fell by more than 20% during the first 6 months following the November 2 program implementation, compared with the same period the previous year.
SUDS Inpatient Addiction Consult Team
In January 2017, SUDS began an inpatient medicine consultation service to offer evaluation, pharmacotherapy, and supportive counseling to patients diagnosed with SUDs who had been admitted to inpatient medical and surgical services. The team, which draws on existing SUDS staff members reallocated to the inpatient service, is led by a SUDS psychiatrist and includes 3 multidisciplinary clinicians with extensive training in the assessment, diagnosis, and treatment planning of SUDs and comorbid conditions. Prior to implementation, the SUDS inpatient addiction consult team met with hospital leadership and with the attending physicians for the inpatient medicine and psychiatry services.
To access the SUDS inpatient addiction consult team, physicians request a consult. Patients are offered an evaluation and are assigned to a level of care with orders for outpatient appointments with a counselor and psychiatrist within 7 days of hospital discharge. Medication-assisted treatment for chronic SUDs is implemented while patients remain admitted to the inpatient medical service. In fiscal year 2018, the SUDS inpatient addiction consult team performed 1,428 inpatient evaluations.
Consistent Treatment Outcome Measures
The BAM is a clinical tool designed to measure patient outcomes in substance use disorders.13 Its 17-item scale measures substance use, risk factors that may lead to relapse, and protective factors (recovery-oriented behaviors that help prevent relapse). It demonstrates sensitivity to change and has excellent test-retest reliability. The BAM has been in use in the addictions treatment program since 2011 but was previously administered only after admission to the IOP and again after a 30- to 90-day follow-up period. Since the program redesign, all SUDS patients are administered the BAM at their initial evaluation and at each individual appointment thereafter. The initial BAM assessment encompasses the previous 30 days; this 30-day version is also used for monthly follow-ups. For BAM assessments that occur within 30 days of the last evaluation, a 7-day version is used. Prior to the redesign, about 24% of patients received a follow-up 30-day BAM assessment.12 Per CPRS review of veterans participating in continued treatment, the rate rose to 100% within 3 months of the redesign.
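The interval rule described above can be summarized in a short, hypothetical Python sketch; the function name and return labels are illustrative assumptions, not part of the BAM instrument or the program's records system.

```python
# Illustrative sketch only: selecting which BAM look-back version to
# administer based on the time since the last assessment, per the rule
# described above.
def bam_version(days_since_last_assessment=None):
    """Return "30-day" or "7-day" depending on the assessment interval."""
    if days_since_last_assessment is None:   # initial evaluation
        return "30-day"
    if days_since_last_assessment < 30:      # repeat within a month
        return "7-day"
    return "30-day"                          # monthly or later follow-up

print(bam_version())     # initial evaluation -> "30-day"
print(bam_version(14))   # assessed 2 weeks ago -> "7-day"
print(bam_version(45))   # assessed 6 weeks ago -> "30-day"
```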
When program staff compared preredesign and postredesign BAM data, they detected significant clinical differences. Data demonstrate a 22.2% improvement in protective factors, including patient confidence in their ability to remain abstinent; engaging in self-help activities, such as attending Alcoholics Anonymous meetings; engaging in organized spiritual activities; going to school, working, or volunteering; securing a regular income; and time spent with friends or family who are supportive of recovery.
The data also show a marked reduction in substance use at follow-up points in treatment and a corresponding decrease in risk factors. One item of the BAM assesses patient level of satisfaction with their treatment. Since the redesign, patients report that they are “considerably” satisfied with their SUD treatment.
Currently, program staff are conducting a review of BAM scores by level of care to further parse the impact of various treatments and best target patient need using measurement-based care and EBP, such as contingency management, which provides small monetary incentives when patients maintain clean urine drug screens.16 In addition, the program plans to achieve more uniformity in BAM assessment intervals at all levels of care, and possibly also integrate BAM data into SUD group therapies. Correlation of the BAM scores to other metrics, such as readmission to inpatient medicine, relapse, urine drug screen, or critical laboratory values, will provide additional insight into impact of programmatic changes.
Discussion
Feedback from other clinics and services within the hospital has been very positive. Some providers have reported that they appreciate the ease and availability of access to SUDS. Additionally, patients engaged in treatment prior to the redesign have been contacted for an updated evaluation and assignment to a counselor and appropriate level of care. From the staff’s perspective, the shift to immediate access to care has allowed a more streamlined process with fewer hurdles for patient admission. Staff report that they now feel empowered to meet the needs of veterans in a comprehensive, same-day fashion.
The success of our redesign was contingent on the support of internal and external stakeholders.
The successful implementation of these changes has revealed several important lessons about patient care. The first lesson was that improving access and integrating best practices is possible without additional resources, outside funding, or disruption to patient services. With the support of MH&BSS leadership, the program streamlined existing processes and used staff and clinic resources more efficiently.
The second lesson involved the importance of continually reviewing and revising standard operating procedures to match the needs of the current patient population. Policies and procedures that once were viewed as potential barriers to change have been replaced with a more flexible approach and willingness to evolve.
As a result, far fewer patients have been lost to treatment. The time and resources that staff historically dedicated to nonclinical patient care are now redirected to immediate service provision. This increase in operational efficiency and treatment efficacy has resulted in a boost to staff morale, even during a time of immense change and increased productivity. Program staff are now able to personally witness the significant changes in their patients’ lives and feel a sense of pride at being a member of a hard-working team that provides the highest quality of substance use treatment. This is critical to job satisfaction and meets the VA mission to provide timely, effective, and evidence-based treatments to patients.
Conclusion
JAHVH strives to continue to provide the highest quality of SUD treatment available. Future directions aim to streamline clinic operations by constantly monitoring and reviewing workloads, while also considering patient feedback. A continuous review of EBP is part of our clinic’s culture. Program leadership endeavors to promote an open environment where providers can share their triumphs and frustrations and foster a team approach to problem solving. Further plans include expanding the range of treatment levels offered by developing a residential SUD treatment facility.
1. Substance Abuse and Mental Health Services Administration. 2015 National Survey on Drug Use and Health: Summary of the Effects of the 2015 NSDUH Questionnaire Redesign: Implications for Data Users. https://www.samhsa.gov/data/sites/default/files/NSDUH-TrendBreak-2015.pdf. Published June 2016. Accessed June 12, 2019.
2. Substance Abuse and Mental Health Services Administration. Results from the 2013 National Survey on Drug Use and Health: Summary of National Findings. NSDUH Series H-48, HHS Publication No. (SMA) 14-4863. Rockville, MD: Substance Abuse and Mental Health Services Administration; 2014.
3. Donovan DM, Rosengren DB, Downey L, Cox GB, Sloan KL. Attrition prevention with individuals awaiting publicly funded drug treatment. Addiction. 2001;96(8):1149-1160.
4. Hser Y, Maglione M, Polinsky ML, Anglin MD. Predicting treatment entry among treatment-seeking drug abusers. J Subst Abuse Treat. 1997;15(3):213-220.
5. Stark MJ, Campbell BK, Brinkerhoff CV. “Hello, may we help you?” A study of attrition prevention at the time of the first phone contact with substance-abusing clients. Am J Drug Alcohol Abuse. 1990;16:67-76.
6. US Department of Veterans Affairs. Uniform Mental Health Services in VA Medical Centers and Clinics. VHA Handbook 1160.01. https://www.va.gov/vhapublications/ViewPublication.asp?pub_ID=1762. Updated November 2015. Accessed December 12, 2017.
7. Veterans Access, Choice and Accountability Act of 2014, 2 USC § 933.
8. DeMarce JM, Gnys M, Raffa SD, Karlin BE. Cognitive Behavioral Therapy for Substance Use Disorders Among Veterans: Therapist Manual. Washington, DC: US Department of Veterans Affairs; 2014.
9. Mee-Lee D, Shulman GD, Fishman MJ, Gastfriend DR, Miller MM, eds. The ASAM Criteria: Treatment Criteria for Addictive, Substance-Related, and Co-Occurring Conditions. 3rd ed. Carson City, NV: The Change Companies; 2013.
10. Substance Abuse and Mental Health Services Administration and National Institute on Alcohol Abuse and Alcoholism. Medication for the Treatment of Alcohol Use Disorder: A Brief Guide. HHS Publication No. (SMA) 15-4907. Rockville, MD: Substance Abuse and Mental Health Services Administration; 2015.
11. Substance Abuse and Mental Health Services Administration and National Institute on Alcohol Abuse and Alcoholism. Medication for Opioid Use Disorder – Full Document. HHS Publication No. (SMA) 18-5063FULLDOC. Rockville, MD: Substance Abuse and Mental Health Services Administration; 2018.
12. Winn JL, Shealy SE, Kropp GJ, Felkins-Dohm D, Gonzales-Nolas C, Francis E. Housing assistance and case management: Improving access to substance use disorder treatment for homeless veterans. Psychol Serv. 2013;10(2):233-240.
13. Cacciola JS, Alterman AI, DePhilippis D, et al. Development and initial evaluation of the Brief Addiction Monitor (BAM). J Subst Abuse Treat. 2013;44(3):256-263.
14. McHugh RK, Hearon BA, Otto MW. Cognitive-behavioral therapy for substance use disorders. Psychiatr Clin North Am. 2010;33:511-525.
15. Karlin BE, Cross G. From the laboratory to the therapy room: national dissemination and implementation of evidence-based psychotherapies in the U.S. Department of Veterans Affairs health care system. Am Psychol. 2014;69:19-33.
16. DePhilippis D, Petry NM, Bonn-Miller MO, Rosenbach SB, McKay JR. The national implementation of contingency management (CM) in the Department of Veterans Affairs: attendance at CM sessions and substance use outcomes. Drug Alcohol Depend. 2018;185:367-373.
Clinically Impressive Tophaceous Gout With Significant Bony Destruction
Gout is an inflammatory condition generally characterized by red, hot, swollen, and painful joints. The disease is often associated with increased serum uric acid levels, which are considered elevated when they are > 6 mg/dL in women and > 7 mg/dL in men. When gout affects joints, the subchondral bone may be involved, leading to destructive, painful changes. This article presents the case of a patient diagnosed with tophaceous gout of the left second toe with bony erosive changes and calcified nodules noted on magnetic resonance imaging (MRI).
Case Presentation
A 70-year-old white male presented to the podiatry clinic for a left second-toe mass that had been diagnosed as tophaceous gout after he was seen by his primary care physician. The patient reported that the mass had grown slowly over the past 10 years. At presentation, he had a 0.2-cm ulcer on the dorsal aspect of the left second-toe mass. The patient stated that the ulcer had appeared recently with some exudate; however, there was no active drainage of material. The patient had a 20-year history of gout that had not been treated with dietary modifications or medication. The patient also stated that although the left second-toe mass did not cause any pain at rest, it did cause pain with shoe gear and during ambulation. A community-based podiatrist had recommended amputation of the second toe, and as a result the patient was seeking a second opinion at the US Department of Veterans Affairs (VA) Lebanon VA Medical Center (VAMC) in Pennsylvania. The patient had not had acute gouty attacks during the past 10 years.
The patient’s medical history was significant for uncontrolled gout, hyperlipidemia, coronary artery disease with 4-vessel coronary artery bypass grafting, impaired fasting glucose, prostate cancer in remission, alcohol misuse (currently limited to ≤ 2 drinks per night), and a 30-year history of cigarette smoking (quit 2 months prior to the visit).1,2
At his first visit to the clinic, an examination revealed distinct bulging of the soft tissues of the second toe of the left foot with a dry sinus tract that was not malodorous (Figure 1). The left second toe was erythematous and edematous. A local increase in skin temperature was present on the second toe of the left foot compared with the contralateral foot and the other toes. The dorsalis pedis and tibialis posterior pulses were easily palpated, and capillary return was within normal limits. Palpation of the plantar aspect of the left second toe elicited mild tenderness. Crepitation was not present at the left second metatarsophalangeal joint (MPJ) or at the interphalangeal joint. There was restricted range of motion at the left second MPJ compared with the right foot and no motion at the proximal interphalangeal joint. Movement at the left second MPJ elicited tenderness. The mass on the left second toe was firm, nonpulsatile, and oval, with a white pigmented appearance, and measured 2 cm x 2.5 cm.
The neurologic examination was noncontributory, with no deficits present. There also was no gross evidence of motor weakness. His initial temporal temperature was 98.2 °F. The initial laboratory findings were uric acid, 9.5 mg/dL; fasting glucose, 117 mg/dL; estimated glomerular filtration rate, 55 mL/min/1.73 m2; erythrocyte sedimentation rate, 6.5 mm/h; and white blood cell count, 6.6 K/µL.3-6
Diagnostic imaging included X-rays of the patient’s feet and an MRI of the left foot. The X-rays showed diffusely osteopenic bones with severe soft tissue swelling surrounding the second proximal interphalangeal joint. Moderate soft tissue swelling also was present at the level of the first metatarsophalangeal joint, accompanied by extensive erosions at both of these joints, most pronounced at the second proximal interphalangeal joint. In addition, there was narrowing at the first MPJ and the first interphalangeal joint. Erosive changes at the tarsometatarsal articulations and small lucencies within the navicular/midfoot joint were suggestive of additional gouty erosions. A small-to-moderate posterior calcaneal enthesophyte and a second, tiny calcaneal enthesophyte also were present (Figure 2).
An MRI showed a destructive soft tissue mass, resulting in overhanging edges, with foci of calcification centered about the proximal interphalangeal joint of the second toe, consistent with a calcified tophaceous gout nodule. The widest dimension of the mass measured 3.2 cm. There also was a less prominent calcified tophaceous gout nodule at the first MPJ. Additional small punched-out lesions involving the bases of the first through fourth metatarsals and the distal aspect of the first cuneiform were in keeping with gouty arthropathy (Figure 3).4,7-10
The initial treatment plan presented to the patient was to amputate the left second toe; however, the patient declined amputation. Treatment guidelines for allopurinol are to titrate in 100-mg increments every 2 weeks until serum uric acid levels are consistently < 6 mg/dL, tophi resolve, and the patient is free of gout attacks.11 We initiated uric acid-lowering therapy with allopurinol at 50 mg/d for 7 days, increasing to 100 mg/d for 7 days, then to 200 mg/d for 10 days. The patient’s serum uric acid level was rechecked once the dose reached 200 mg/d. Our patient could not tolerate the allopurinol and decided to discontinue treatment. After 1 year, he began having severe pain and returned to have the toe amputated. The patient healed uneventfully.
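As a purely illustrative aside (not clinical guidance), the stepwise escalation described above can be expressed as a simple schedule calculation. In the sketch below, the starting dose, maximum daily dose, and step interval are assumptions chosen only to demonstrate the arithmetic of 100-mg increments toward the < 6 mg/dL target cited in the guideline.

```python
# Illustrative sketch only: a hypothetical allopurinol dose-escalation schedule
# following the stepwise pattern described in the text (100-mg increments every
# 2 weeks until the serum uric acid target is reached). The starting dose,
# 800-mg/d cap, and step interval are assumptions, not clinical guidance.

def titration_schedule(start_mg=100, step_mg=100, max_mg=800, weeks_per_step=2):
    """Yield (week, daily dose in mg) pairs for a simple stepwise escalation."""
    week, dose = 0, start_mg
    while dose <= max_mg:
        yield week, dose
        week += weeks_per_step
        dose += step_mg

if __name__ == "__main__":
    TARGET_URIC_ACID = 6.0  # mg/dL, the treat-to-target threshold cited in the text
    for week, dose in titration_schedule():
        print(f"Week {week:2d}: allopurinol {dose} mg/d; "
              f"recheck serum uric acid and stop escalating once consistently < {TARGET_URIC_ACID} mg/dL")
```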
Discussion
Tophaceous gout is characterized by collections of solid urate accompanied by chronic inflammatory and often destructive changes in the surrounding tissue, brought on by periods of increased uric acid levels. Because of the patient’s 20-year history of untreated tophaceous gout, the extent of the bony and soft tissue destruction this pathology can create was readily apparent. This patient’s uric acid value of 9.5 mg/dL was well above the normal reference range of 2.6 to 7.2 mg/dL. The X-rays suggested that there was not only bony destruction but also deformity.
Destruction of the surrounding soft tissues was evident as advanced, nonhealing wounds formed over the area of the tophi. The size of the second digit also was impressive, causing displacement of the other digits. As reported in the literature, tophaceous gout is usually painless, as was the case in our patient. It was the combination of the relatively painless nature of this pathology and the absence of treatment over many years that led to the patient’s level of deformity and tissue destruction.
Conclusion
We describe a common presentation of bone involvement secondary to significant tophaceous gout in the absence of osteomyelitis. The goal of treatment was to maintain a functional foot free of major deformity, pain, or associated risk factors that could lead to a more significant surgical procedure, such as a proximal amputation.11 Given the destructive nature of this pathology, it is important to educate the patient, perform regular examinations, and start medications early to control uric acid levels. These measures improve the patient’s prognosis and help avoid severe sequelae.
1. Zhu Y, Pandya BJ, Choi HK. Prevalence of gout and hyperuricemia in the US general population: the National Health and Nutrition Examination Survey 2007-2008. Arthritis Rheum. 2011;63(10):3136-3141.
2. Roddy E, Choi HK. Epidemiology of gout. Rheum Dis Clin North Am. 2014;40(2):155-175.
3. Choi H. Epidemiology of crystal arthropathy. Rheum Dis Clin North Am. 2006;32(2):255-273.
4. Nakayama DA, Barthelemy C, Carrera G, Lightfoot RW Jr, Wortmann RL. Tophaceous gout: a clinical and radiographic assessment. Arthritis Rheum. 1984;27(4):468-471.
5. Dalbeth N, Haskard DO. Pathophysiology of crystal-induced arthritis. In: Wortmann RL, Schumacher HR Jr, Becker MA, Ryan LM, eds. Crystal-induced Arthropathies. New York: Taylor & Francis; 2006.
6. Dalbeth N, Pool B, Gamble GD, et al. Cellular characterization of the gouty tophus: a quantitative analysis. Arthritis Rheum. 2010;62(5):1549-1556.
7. Hsu CY, Shih TT, Huang KM, Chen PQ, Sheu JJ, Li YW. Tophaceous gout of the spine: MR imaging features. Clin Radiol. 2002;57(10):919-925.
8. Schumacher HR Jr, Becker MA, Edwards NL, et al. Magnetic resonance imaging in the quantitative assessment of gouty tophi. Int J Clin Pract. 2006;60(4):408-414.
9. McQueen FM, Doyle A, Dalbeth N. Imaging in the crystal arthropathies. Rheum Dis Clin North Am. 2014;40(2):231-249.
10. Choi HK, Al-Arfaj AM, Eftekhari A, et al. Dual energy computed tomography in tophaceous gout. Ann Rheum Dis. 2009;68(10):1609-1612.
11. Khanna D, Fitzgerald JD, Khanna PP, et al; American College of Rheumatology. 2012 American College of Rheumatology guidelines for management of gout. Part 1: systematic nonpharmacologic and pharmacologic therapeutic approaches to hyperuricemia. Arthritis Care Res (Hoboken). 2012;64(10):1431-1446.
Usage of and Attitudes Toward Health Information Exchange Before and After System Implementation in a VA Medical Center
More than 9 million veterans are enrolled in the Veterans Health Administration (VHA). A high percentage of veterans who use VHA services have multiple chronic conditions and complex medical needs.1 In addition to receiving health care from the VHA, many of these patients receive additional services from non-VHA providers in the community. Furthermore, recently enacted laws and programs, such as the 2018 VA MISSION Act and the 2014 VA Choice Program, have increased veterans’ use of community health care services.
VHA staff face considerable barriers when seeking documentation about non-VHA services delivered in the community, where care can be fragmented across multiple health care systems. In many VHA medical centers, staff must telephone non-VHA sites of care and/or use time-consuming fax services to request community-based patient records. VA health care providers (HCPs) often report that community records are not available when timely clinical decisions must be made, or that they must make those decisions without knowledge of past or co-occurring assessments and treatment plans. Without access to comprehensive health records, patients are at risk for duplicated treatment, medication errors, and death.2,3
Background
To improve the continuity and safety of health care, US governmental and health information experts stimulated formal communication among HCPs via the 2009 Health Information Technology for Economic and Clinical Health (HITECH) Act.4,5 One of the primary aims of the HITECH Act was to promote reliable and interoperable electronic sharing of clinical information through health information exchange (HIE) for both patients and HCPs. Monetary incentives encouraged regional, state, or state-funded organizations to create and promote HIE capabilities.
Presently, empirical data are not available that describe the effect of external HIE systems in VHA settings. However, data examining non-VHA settings suggest that HIE may improve quality of care, although findings are mixed. For example, some research has found that HIE reduces hospital admissions, duplicated test ordering, and health care costs and improves decision making, whereas other research has found no change.3,6-13 Barriers to HIE use noted in community settings include poorly designed interfaces, inefficient workflow, and incomplete record availability.3,6-10,14
A few US Department of Veterans Affairs (VA) medical centers have recently initiated contracts with HIE organizations. Because much of the present research evaluates internally developed HIE systems, scholars in the field have identified a pressing need for useful statistics before and after implementation of externally developed HIE systems.13,15 Additionally, scholars call for data examining nonacademic settings (eg, VHA medical centers) and for diverse patient populations (eg, individuals with chronic disorders, veterans).13
This quality improvement project had 2 goals. The first goal was to assess baseline descriptive statistics related to requesting/obtaining community health records in a VHA setting. The second goal was to evaluate VHA staff access to needed community health records (eg, records stemming from community consults) before and after implementation of an externally developed HIE system.
Methods
This project was a single-center, quality improvement evaluation examining the effect of implementing an HIE system, developed by an external nonprofit organization. The project protocol was approved by the VA Pacific Islands Healthcare System (VAPIHCS) Evidence-Based Practices Council. Clinicians’ responses were anonymous, and data were reported only in aggregate. Assessment was conducted by an evaluator who was not associated with the HIE system developers and its implementation, reducing the chance of bias.15
Coinciding with the HIE system implementation and prior to having access to it, VAPIHCS medical and managed care staff were invited to complete an online needs assessment tool. Voluntary trainings on the system were offered at various times on multiple days and lasted approximately 1 hour. Six months after the HIE system was implemented, a postassessment tool reevaluated HIE-related access.
VHA Setting and HIE System
VAPIHCS serves about 55,000 unique patients across a 2.6 million square-mile catchment area (Hawaii and Pacific Island territories). Facilities include a medium-sized, urban VA medical center and 7 suburban or rural/remote primary care outpatient clinics.
VAPIHCS contracted with Hawaii Health Information Exchange (HHIE), a nonprofit organization designated by the state of Hawaii to develop a seamless, secure HIE system. According to HHIE, 83% of the 23 hospitals in the state and 55% of Hawaii’s 2,927 active practicing physicians have adopted the HIE system (F. Chan, personal communication, December 12, 2018). HHIE’s data sources provide real-time access to a database of 20 million health records. Available records include data such as patients’ reasons for referral, encounter diagnoses, medications, immunizations, and discharge instructions from many (but not all) HCPs in Hawaii.
HHIE reports that it has the capacity to interface with all electronic health records systems currently in use in the community (F. Chan, personal communication, December 12, 2018). Although the HIE system can provide directed exchange (ie, sending and receiving secure information electronically between HCPs), the HIE system implemented at VAPIHCS was limited to query-retrieve (ie, practitioner-initiated requests for information from other community HCPs). Specifically, to access patient records, practitioners log in to the HIE portal and enter a patient’s name in a search window. The system then generates a consolidated virtual chart with data collected from all HIE data-sharing participants. To share records, community HCPs either build or enable a profile that exposes their data through an integrated health care enterprise electronic communication interface. However, VHA records were not made available to community HCPs at this initial stage.
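To make the query-retrieve consolidation step concrete, the minimal sketch below merges records returned by several data-sharing participants into a single, chronologically ordered virtual chart. The participant names, record fields, and in-memory feeds are hypothetical stand-ins assumed for illustration; they do not represent HHIE’s actual data sources or interfaces.

```python
# Minimal sketch of query-retrieve consolidation: gather one patient's records
# from every data-sharing participant and merge them into one virtual chart.
# All names, fields, and sample data below are hypothetical.
from dataclasses import dataclass
from datetime import date
from typing import Dict, List

@dataclass
class Record:
    source: str          # data-sharing participant holding the record
    encounter_date: date
    record_type: str     # eg, "encounter diagnosis", "medication", "discharge instructions"
    summary: str

# Stand-in for each participant's query interface: patient name -> records.
FEEDS: Dict[str, Dict[str, List[Record]]] = {
    "Community Hospital A": {
        "DOE, JOHN": [Record("Community Hospital A", date(2018, 3, 2),
                             "discharge instructions", "Postoperative wound care")],
    },
    "Community Clinic B": {
        "DOE, JOHN": [Record("Community Clinic B", date(2018, 5, 14),
                             "encounter diagnosis", "Gout flare, left foot")],
    },
}

def consolidated_virtual_chart(patient_name: str) -> List[Record]:
    """Query every data-sharing participant and merge the results into one chart."""
    chart: List[Record] = []
    for feed in FEEDS.values():
        chart.extend(feed.get(patient_name, []))
    # Present the merged chart chronologically, newest encounter first.
    return sorted(chart, key=lambda r: r.encounter_date, reverse=True)

if __name__ == "__main__":
    for record in consolidated_virtual_chart("DOE, JOHN"):
        print(record.encounter_date, record.source, record.record_type, "-", record.summary)
```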
Measures and Statistical Analysis
A template of quality improvement-related questions was adapted for this project with input from subject matter experts. Questions were then modified further based on interviews with 5 clinical and managed care staff members. The final needs assessment tool, delivered online, consisted of up to 20 multiple-choice items and 2 open-ended questions. A 22-item evaluation tool was administered 6 months after system implementation. Frequencies were obtained for descriptive items, and group responses were compared across time.
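As an illustration of this descriptive analysis, the sketch below tabulates response frequencies for one survey item and compares them across the pre- and postimplementation administrations. The file names and column names are hypothetical assumptions; only the general approach (frequencies compared across time) comes from the text above.

```python
# Sketch of the descriptive analysis: per-item response frequencies, compared
# across the pre- and postimplementation surveys. File and column names are
# hypothetical; only the overall approach mirrors the text.
import pandas as pd

pre = pd.read_csv("needs_assessment.csv")    # preimplementation responses (n = 39)
post = pd.read_csv("post_evaluation.csv")    # postimplementation responses (n = 20)

def item_frequencies(responses: pd.DataFrame, item: str) -> pd.Series:
    """Percentage of respondents choosing each option for one survey item."""
    return (responses[item]
            .value_counts(normalize=True)
            .mul(100)
            .round(1))

# Compare one item (eg, reported access to community records) across time.
comparison = pd.concat(
    {"Pre": item_frequencies(pre, "access_to_community_records"),
     "Post": item_frequencies(post, "access_to_community_records")},
    axis=1,
).fillna(0)
print(comparison)
```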
Results
Thirty-nine staff (32 medical and 7 managed care staff) completed the needs assessment, and 20 staff (16 medical and 4 managed care staff) completed the postimplementation evaluation.
Before implementation of the HIE system, most staff (54%) indicated that they spent > 1 hour a week conducting tasks related to seeking and/or obtaining health records from the community. The largest percentage of staff (27%) requested > 10 community records during a typical week. Most respondents indicated that they would use an easy tool to instantly retrieve community health records at least 20 times per week (Table 1).
Preimplementation, 32.4% of respondents indicated that they could access community-based health records sometimes. Postimplementation, most respondents indicated they could access the records most of the time (Figure 1).
Preimplementation, staff most frequently indicated they were very dissatisfied with their current level of access to community records. Postimplementation, more staff were somewhat satisfied or very satisfied (Figure 2). Postimplementation, 48% of staff reported using the HIE system either several times a month or 2 to 4 times a week, 19% used the system daily, 19% had used it only 1 to 2 times, and 14% never used the system. Most staff (67%) reported that the system somewhat improved access to records and supported continuing the contract with the HIE system. Conversely, 18% of respondents said that their access did not improve enough for the system to be of use to them.
Preimplementation, staff most frequently indicated that they did not have time (28.6%) or sufficient staff (25.7%) to request records (Table 2). Postimplementation, staff most frequently (33.3%) indicated that they had no problems accessing the HIE system, but 6.7% reported having time or interface/software difficulties.
Discussion
This report assessed a quality improvement project designed to increase VHA access to community health records via an external HIE system. Prior to this work, no data were available on use, barriers, and staff satisfaction related to implementing an externally developed HIE system within a VA medical center.13,15
Before the medical center implemented the HIE system, logistical barriers prevented most HCPs and managed care staff from obtaining needed community records. Staff faced challenges such as a lack of time as well as rudimentary barriers, such as community clinics not responding to requests or fax machines not working. Time remained a challenge after implementation, but this work demonstrated that the HIE system helped staff overcome many logistical barriers.
After implementation of the HIE system, staff reported an improvement in access and satisfaction related to retrieving community health records. These findings are consistent with most but not all evaluations of HIE systems.3,6,7,12,13 In the present work, staff used the system several times a month or several times a week, and most staff believed that access to the HIE system should be continued. Still, improvement was incomplete. The HIE system increased access to specific types of records (eg, reports) and health care systems (eg, large hospitals), but not others. As a result, the system was more useful for some staff than for others.
Research examining HIE systems in community and academic settings has identified factors that deter their use, such as poorly designed interfaces, inefficient workflow, and incomplete record availability.3,6,7,14,16 In the present project, incomplete record availability was a noted barrier. Additionally, a few staff reported system interface issues. However, most staff found the system easy to use as part of their daily workflow.
Because the HIE system had a meaningful, positive impact on VHA providers and staff, it will be sustained at VAPIHCS. Specifically, the contract with the HHIE has been renewed, and the number of user licenses has increased. Staff users now self-refer for the service or can be referred by their service chiefs.
Limitations
This work was designed to evaluate the effect of an HIE system on staff in 1 VHA setting; thus, findings may not be generalizable to other settings or HIE systems. Limitations of the present work include the small sample of respondents, the limited time frame for responses, and the limited response rate. A logical next step would be research comparing access to the HIE system with no access on factors such as workload productivity, cost savings, and patient safety.
Conclusion
The vision of the HITECH Act was to improve the continuity and safety of health care via reliable and interoperable electronic sharing of clinical information across health care entities.6 This VHA quality improvement project demonstrated a meaningful improvement in staff’s level of satisfaction with access to community health records when staff used an externally developed HIE system. Not all types of records (eg, progress notes) were accessible, which resulted in the system being useful for most but not all staff.
In the future, the federal government’s internally developed Veterans Health Information Exchange (formerly known as the Virtual Lifetime Electronic Record [VLER]) is expected to enable VHA, the Department of Defense, and participating community care providers to access shared electronic health records nationally. However, until we can achieve that envisioned interoperability, VHA staff can use HIE and other clinical support applications to access health records.
1. Yu W, Ravelo A, Wagner TH, et al. Prevalence and costs of chronic conditions in the VA health care system. Med Care Res Rev. 2003;60(3)(suppl):146S-167S.
2. Bourgeois FC, Olson KL, Mandl KD. Patients treated at multiple acute health care facilities: quantifying information fragmentation. Arch Intern Med. 2010;170(22):1989-1995.
3. Rudin RS, Motala A, Goldzweig CL, Shekelle PG. Usage and effect of health information exchange: a systematic review. Ann Intern Med. 2014;161(11):803-811.
4. Blumenthal D. Implementation of the federal health information technology initiative. N Engl J Med. 2011;365(25):2426-2431.
5. The Office of the National Coordinator for Health Information Technology. Connecting health and care for the nation: a shared nationwide interoperability roadmap. Final version 1.0. https://www.healthit.gov/sites/default/files/hie-interoperability/nationwide-interoperability-roadmap-final-version-1.0.pdf. Accessed May 22, 2019.
6. Detmer D, Bloomrosen M, Raymond B, Tang P. Integrated personal health records: transformative tools for consumer-centric care. BMC Med Inform Decis Mak. 2008;8:45.
7. Hersh WR, Totten AM, Eden KB, et al. Outcomes from health information exchange: systematic review and future research needs. JMIR Med Inform. 2015;3(4):e39.
8. Vest JR, Kern LM, Campion TR Jr, Silver MD, Kaushal R. Association between use of a health information exchange system and hospital admissions. Appl Clin Inform. 2014;5(1):219-231.
9. Vest JR, Jung HY, Ostrovsky A, Das LT, McGinty GB. Image sharing technologies and reduction of imaging utilization: a systematic review and meta-analysis. J Am Coll Radiol. 2015;12(12 pt B):1371-1379.e3.
10. Walker DM. Does participation in health information exchange improve hospital efficiency? Health Care Manag Sci. 2018;21(3):426-438.
11. Gordon BD, Bernard K, Salzman J, Whitebird RR. Impact of health information exchange on emergency medicine clinical decision making. West J Emerg Med. 2015;16(7):1047-1051.
12. Hincapie A, Warholak T. The impact of health information exchange on health outcomes. Appl Clin Inform. 2011;2(4):499-507.
13. Rahurkar S, Vest JR, Menachemi N. Despite the spread of health information exchange, there is little evidence of its impact on cost, use, and quality of care. Health Aff (Millwood). 2015;34(3):477-483.
14. Eden KB, Totten AM, Kassakian SZ, et al. Barriers and facilitators to exchanging health information: a systematic review. Int J Med Inform. 2016;88:44-51.
15. Hersh WR, Totten AM, Eden K, et al. The evidence base for health information exchange. In: Dixon BE, ed. Health Information Exchange: Navigating and Managing a Network of Health Information Systems. Cambridge, MA: Academic Press; 2016:213-229.
16. Blavin F, Ramos C, Cafarella Lallemand N, Fass J, Ozanich G, Adler-Milstein J. Analyzing the public benefit attributable to interoperable health information exchange. https://aspe.hhs.gov/system/files/pdf/258851/AnalyzingthePublicBenefitAttributabletoInteroperableHealth.pdf. Published July 2017. Accessed May 22, 2019.
More than 9 million veterans are enrolled in the Veterans Health Administration (VHA). A high percentage of veterans who use VHA services have multiple chronic conditions and complex medical needs.1 In addition to receiving health care from the VHA, many of these patients receive additional services from non-VHA providers in the community. Furthermore, recent laws enacted, such as the 2018 VA MISSION Act and the 2014 VA Choice Program, have increased veterans’ use of community health care services.
VHA staff face considerable barriers when seeking documentation about non-VHA services delivered in the community, which can be fragmented across multiple health care systems. In many VHA medical centers, staff must telephone non-VHA sites of care and/or use time-consuming fax services to request community-based patient records. VA health care providers (HCPs) often complain that community records are not available to make timely clinical decisions or that they must do so without knowing past or co-occurring assessments or treatment plans. Without access to comprehensive health records, patients are at risk for duplicated treatment, medication errors, and death.2,3
Background
To improve the continuity and safety of health care, US governmental and health information experts stimulated formal communication among HCPs via the 2009 Health Information Technology for Economic and Clinical Health (HITECH) Act.4,5 One of the primary aims of the HITECH Act was to promote reliable and interoperable electronic sharing of clinical information through health information exchange (HIE) for both patients and HCPs. Monetary incentives encouraged regional, state, or state-funded organizations to create and promote HIE capabilities.
Presently, empirical data are not available that describe the effect of external HIE systems in VHA settings. However, data examining non-VHA settings suggest that HIE may improve quality of care, although findings are mixed. For example, some research has found that HIE reduces hospital admissions, duplicated test ordering, and health care costs and improves decision making, whereas other research has found no change.3,6-13 Barriers to HIE use noted in community settings include poorly designed interfaces, inefficient workflow, and incomplete record availability.3,6-10,14
A few US Department of Veterans Affairs (VA) medical centers have recently initiated contracts with HIE organizations. Because much of the present research evaluates internally developed HIE systems, scholars in the field have identified a pressing need for useful statistics before and after implementation of externally developed HIE systems.13,15 Additionally, scholars call for data examining nonacademic settings (eg, VHA medical centers) and for diverse patient populations (eg, individuals with chronic disorders, veterans).13This quality improvement project had 2 goals. The first goal was to assess baseline descriptive statistics related to requesting/obtaining community health records in a VHA setting. The second goal was to evaluate VHA staff access to needed community health records (eg, records stemming from community consults) before and after implementation of an externally developed HIE system.
Methods
This project was a single-center, quality improvement evaluation examining the effect of implementing an HIE system, developed by an external nonprofit organization. The project protocol was approved by the VA Pacific Islands Healthcare System (VAPIHCS) Evidence-Based Practices Council. Clinicians’ responses were anonymous, and data were reported only in aggregate. Assessment was conducted by an evaluator who was not associated with the HIE system developers and its implementation, reducing the chance of bias.15
Coinciding with the HIE system implementation and prior to having access to it, VAPIHCS medical and managed care staff were invited to complete an online needs assessment tool. Voluntary trainings on the system were offered at various times on multiple days and lasted approximately 1 hour. Six months after the HIE system was implemented, a postassessment tool reevaluated HIE-related access.
VHA Setting and HIE System
VAPIHCS serves about 55,000 unique patients across a 2.6 million square-mile catchment area (Hawaii and Pacific Island territories). Facilities include a medium-sized, urban VA medical center and 7 suburban or rural/remote primary care outpatient clinics.
VAPIHCS contracted with Hawaii Health Information Exchange (HHIE), a nonprofit organization that was designated by the state of Hawaii to develop a seamless, secure HIE system. According to HHIE, 83% of the 23 hospitals in the state and 55% of Hawaii’s 2,927 active practicing physicians have adopted the HIE system (F. Chan, personal communication, December 12, 2018). HHIE’s data sources provide real-time access to a database of 20 million health records. Records include, among other records, data such as patients’ reasons for referral, encounter diagnoses, medications, immunizations, and discharge instructions from many (but not all) HCPs in Hawaii.
HHIE reports that it has the capacity to interface with all electronic health records systems currently in use in the community (F. Chan, personal communication, December 12, 2018). Although the HIE system can provide directed exchange (ie, sending and receiving secure information electronically between HCPs), the HIE system implemented in the VAPIHCS was limited to query-retrieve (ie, practitioner-initiated requests for information from other community HCPs). Specifically, to access patient records, practitioners log in to the HIE portal and enter a patient’s name in a search window. The system then generates a consolidated virtual chart with data collected from all HIE data-sharing participants. To share records, community HCPs either build or enable a profile in an integrated health care enterprise electronic communication interface into their data. However, VHA records were not made available to community HCPs at this initial stage.
Measures and Statistical Analysis
A template of quality improvement-related questions was adapted for this project with input from subject matter experts. Questions were then modified further based on interviews with 5 clinical and managed care staff members. The final online tool consisted of up to 20 multiple choice items and 2 open-ended questions delivered online. A 22-item evaluation tool was administered 6 months after system implementation. Frequencies were obtained for descriptive items, and group responses were compared across time.
Results
Thirty-nine staff (32 medical and 7 managed care staff) completed the needs assessment, and 20 staff (16 medical and 4 managed care staff) completed the postimplementation evaluation.
Before implementation of the HIE system, most staff (54%) indicated that they spent > 1 hour a week conducting tasks related to seeking and/or obtaining health records from the community. The largest percentage of staff (27%) requested > 10 community records during a typical week. Most respondents indicated that they would use an easy tool to instantly retrieve community health records at least 20 times per week (Table 1).
Preimplementation, 32.4% of respondents indicated that they could access community-based health records sometimes. Postimplementation, most respondents indicated they could access the records most of the time (Figure 1).
More than 9 million veterans are enrolled in the Veterans Health Administration (VHA). A high percentage of veterans who use VHA services have multiple chronic conditions and complex medical needs.1 In addition to receiving health care from the VHA, many of these patients receive additional services from non-VHA providers in the community. Furthermore, recently enacted laws, such as the 2018 VA MISSION Act and the 2014 VA Choice Program, have increased veterans’ use of community health care services.
VHA staff face considerable barriers when seeking documentation about non-VHA services delivered in the community, which can be fragmented across multiple health care systems. In many VHA medical centers, staff must telephone non-VHA sites of care and/or use time-consuming fax services to request community-based patient records. VA health care providers (HCPs) often report that community records are not available in time to inform clinical decisions, leaving them to make those decisions without knowledge of past or co-occurring assessments or treatment plans. Without access to comprehensive health records, patients are at risk for duplicated treatment, medication errors, and death.2,3
Background
To improve the continuity and safety of health care, US governmental and health information experts stimulated formal communication among HCPs via the 2009 Health Information Technology for Economic and Clinical Health (HITECH) Act.4,5 One of the primary aims of the HITECH Act was to promote reliable and interoperable electronic sharing of clinical information through health information exchange (HIE) for both patients and HCPs. Monetary incentives encouraged regional, state, or state-funded organizations to create and promote HIE capabilities.
Presently, empirical data are not available that describe the effect of external HIE systems in VHA settings. However, data examining non-VHA settings suggest that HIE may improve quality of care, although findings are mixed. For example, some research has found that HIE reduces hospital admissions, duplicated test ordering, and health care costs and improves decision making, whereas other research has found no change.3,6-13 Barriers to HIE use noted in community settings include poorly designed interfaces, inefficient workflow, and incomplete record availability.3,6-10,14
A few US Department of Veterans Affairs (VA) medical centers have recently initiated contracts with HIE organizations. Because much of the present research evaluates internally developed HIE systems, scholars in the field have identified a pressing need for useful statistics before and after implementation of externally developed HIE systems.13,15 Additionally, scholars call for data examining nonacademic settings (eg, VHA medical centers) and for diverse patient populations (eg, individuals with chronic disorders, veterans).13
This quality improvement project had 2 goals. The first goal was to assess baseline descriptive statistics related to requesting/obtaining community health records in a VHA setting. The second goal was to evaluate VHA staff access to needed community health records (eg, records stemming from community consults) before and after implementation of an externally developed HIE system.
Methods
This project was a single-center, quality improvement evaluation examining the effect of implementing an HIE system, developed by an external nonprofit organization. The project protocol was approved by the VA Pacific Islands Healthcare System (VAPIHCS) Evidence-Based Practices Council. Clinicians’ responses were anonymous, and data were reported only in aggregate. Assessment was conducted by an evaluator who was not associated with the HIE system developers and its implementation, reducing the chance of bias.15
Coinciding with the HIE system implementation and prior to having access to it, VAPIHCS medical and managed care staff were invited to complete an online needs assessment tool. Voluntary trainings on the system were offered at various times on multiple days and lasted approximately 1 hour. Six months after the HIE system was implemented, a postassessment tool reevaluated HIE-related access.
VHA Setting and HIE System
VAPIHCS serves about 55,000 unique patients across a 2.6 million square-mile catchment area (Hawaii and Pacific Island territories). Facilities include a medium-sized, urban VA medical center and 7 suburban or rural/remote primary care outpatient clinics.
VAPIHCS contracted with Hawaii Health Information Exchange (HHIE), a nonprofit organization that was designated by the state of Hawaii to develop a seamless, secure HIE system. According to HHIE, 83% of the 23 hospitals in the state and 55% of Hawaii’s 2,927 active practicing physicians have adopted the HIE system (F. Chan, personal communication, December 12, 2018). HHIE’s data sources provide real-time access to a database of 20 million health records. These records include data such as patients’ reasons for referral, encounter diagnoses, medications, immunizations, and discharge instructions from many (but not all) HCPs in Hawaii.
HHIE reports that it has the capacity to interface with all electronic health records systems currently in use in the community (F. Chan, personal communication, December 12, 2018). Although the HIE system can provide directed exchange (ie, sending and receiving secure information electronically between HCPs), the HIE system implemented in the VAPIHCS was limited to query-retrieve (ie, practitioner-initiated requests for information from other community HCPs). Specifically, to access patient records, practitioners log in to the HIE portal and enter a patient’s name in a search window. The system then generates a consolidated virtual chart with data collected from all HIE data-sharing participants. To share records, community HCPs either build or enable an integrated health care enterprise electronic communication interface profile for their data. However, VHA records were not made available to community HCPs at this initial stage.
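To make the query-retrieve workflow concrete, the brief Python sketch below shows what a practitioner-initiated record query might look like in code. The endpoint URL, parameter names, and response format are hypothetical placeholders assumed for illustration; the HHIE system described here was accessed through its login portal, not through this interface.

import requests

HIE_BASE_URL = "https://hie.example.org/api"  # hypothetical endpoint for illustration only

def query_patient_chart(session_token, patient_name):
    """Practitioner-initiated query: request a consolidated virtual chart by patient name."""
    response = requests.get(
        f"{HIE_BASE_URL}/patients",
        params={"name": patient_name},
        headers={"Authorization": f"Bearer {session_token}"},
        timeout=30,
    )
    response.raise_for_status()
    # Each returned record could carry referral reason, encounter diagnoses,
    # medications, immunizations, and discharge instructions.
    return response.json()

chart = query_patient_chart("example-token", "DOE, JOHN")
print(f"Retrieved {len(chart.get('records', []))} community records")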
Measures and Statistical Analysis
A template of quality improvement-related questions was adapted for this project with input from subject matter experts. Questions were then modified further based on interviews with 5 clinical and managed care staff members. The final preimplementation tool, delivered online, consisted of up to 20 multiple choice items and 2 open-ended questions. A 22-item evaluation tool was administered 6 months after system implementation. Frequencies were obtained for descriptive items, and group responses were compared across time.
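As a rough illustration of the descriptive analysis, the Python sketch below tallies response frequencies for one survey item before and after implementation. The response categories and counts are invented for demonstration and do not reproduce the project’s data.

from collections import Counter

# Invented example responses to one access item, pre- and postimplementation
pre_responses = ["never", "rarely", "sometimes", "sometimes", "rarely"]
post_responses = ["sometimes", "most of the time", "most of the time", "always"]

def frequency_table(responses):
    """Percentage of respondents selecting each category."""
    counts = Counter(responses)
    total = len(responses)
    return {category: round(100 * n / total, 1) for category, n in counts.items()}

print("Preimplementation: ", frequency_table(pre_responses))
print("Postimplementation:", frequency_table(post_responses))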
Results
Thirty-nine staff (32 medical and 7 managed care staff) completed the needs assessment, and 20 staff (16 medical and 4 managed care staff) completed the postimplementation evaluation.
Before implementation of the HIE system, most staff (54%) indicated that they spent > 1 hour a week conducting tasks related to seeking and/or obtaining health records from the community. The largest percentage of staff (27%) requested > 10 community records during a typical week. Most respondents indicated that they would use an easy tool to instantly retrieve community health records at least 20 times per week (Table 1).
Preimplementation, 32.4% of respondents indicated that they could access community-based health records sometimes. Postimplementation, most respondents indicated they could access the records most of the time (Figure 1).
Preimplementation, staff most frequently indicated they were very dissatisfied with the current level of access to community records. Postimplementation, more staff were somewhat satisfied or very satisfied (Figure 2). Postimplementation, 48% of staff reported most often using the HIE system either several times a month or 2 to 4 times a week, 19% used the system daily, 19% used it only 1 or 2 times, and 14% never used the system. Most staff (67%) reported that the system somewhat improved access to records and supported continuing the contract with the HIE system. Conversely, 18% of respondents said that their access did not improve enough for the system to be of use to them.
Preimplementation, staff most frequently indicated that they did not have time (28.6%) or sufficient staff (25.7%) to request records (Table 2). Postimplementation, staff most frequently (33.3%) indicated that they had no problems accessing the HIE system, but 6.7% reported having time or interface/software difficulties.
Discussion
This report assessed a quality improvement project designed to increase VHA access to community health records via an external HIE system. Prior to this work, no data were available on use, barriers, and staff satisfaction related to implementing an externally developed HIE system within a VA medical center.13,15
Before the medical center implemented the HIE system, logistical barriers prevented most HCPs and managed care staff from obtaining needed community records. Staff faced challenges such as lacking time as well as rudimentary barriers, such as community clinics not responding to requests or the fax machine not working. Time remained a challenge after implementation, but this work demonstrated that the HIE system helped staff overcome many logistical barriers.
After implementation of the HIE system, staff reported an improvement in access and satisfaction related to retrieving community health records. These findings are consistent with most but not all evaluations of HIE systems.3,6,7,12,13 In the present work, staff used the system several times a month or several times a week, and most staff believed that access to the HIE system should be continued. Still, improvement was incomplete. The HIE system increased access to specific types of records (eg, reports) and health care systems (eg, large hospitals), but not others. As a result, the system was more useful for some staff than for others.
Research examining HIE systems in community and academic settings has identified factors that deter their use, such as poorly designed interfaces, inefficient workflow, and incomplete record availability.3,6,7,14,16 In the present project, incomplete record availability was a noted barrier. Additionally, a few staff reported system interface issues. However, most staff found the system easy to use as part of their daily workflow.
Because the HIE system had a meaningful, positive impact on VHA providers and staff, it will be sustained at VAPIHCS. Specifically, the contract with the HHIE has been renewed, and the number of user licenses has increased. Staff users now self-refer for the service or can be referred by their service chiefs.
Limitations
This work was designed to evaluate the effect of an HIE system on staff in 1 VHA setting; thus, findings may not be generalizable to other settings or HIE systems. Limitations of the present work include the small sample of respondents, the limited time frame for responses, and the low response rate. The logical next step would be research efforts to compare access to the HIE system with no access on factors such as workload productivity, cost savings, and patient safety.
Conclusion
The vision of the HITECH Act was to improve the continuity and safety of health care via reliable and interoperable electronic sharing of clinical information across health care entities.6 This VHA quality improvement project demonstrated a meaningful improvement in staff’s level of satisfaction with access to community health records when staff used an externally developed HIE system. Not all types of records (eg, progress notes) were accessible, which resulted in the system being useful for most but not all staff.
In the future, the federal government’s internally developed Veterans Health Information Exchange (formerly known as the Virtual Lifetime Electronic Record [VLER]) is expected to enable VHA, the Department of Defense, and participating community care providers to access shared electronic health records nationally. However, until we can achieve that envisioned interoperability, VHA staff can use HIE and other clinical support applications to access health records.
1. Yu W, Ravelo A, Wagner TH, et al. Prevalence and costs of chronic conditions in the VA health care system. Med Care Res Rev. 2003;60(3)(suppl):146S-167S.
2. Bourgeois FC, Olson KL, Mandl KD. Patients treated at multiple acute health care facilities: quantifying information fragmentation. Arch Intern Med. 2010;170(22):1989-1995.
3. Rudin RS, Motala A, Goldzweig CL, Shekelle PG. Usage and effect of health information exchange: a systematic review. Ann Intern Med. 2014;161(11):803-811.
4. Blumenthal D. Implementation of the federal health information technology initiative. N Engl J Med. 2011;365(25):2426-2431.
5. The Office of the National Coordinator for Health Information Technology. Connecting health and care for the nation: a shared nationwide interoperability roadmap. Final version 1.0. https://www.healthit.gov/sites/default/files/hie-interoperability/nationwide-interoperability-roadmap-final-version-1.0.pdf. Accessed May 22, 2019.
6. Detmer D, Bloomrosen M, Raymond B, Tang P. Integrated personal health records: transformative tools for consumer-centric care. BMC Med Inform Decis Mak. 2008;8:45.
7. Hersh WR, Totten AM, Eden KB, et al. Outcomes from health information exchange: systematic review and future research needs. JMIR Med Inform. 2015;3(4):e39.
8. Vest JR, Kern LM, Campion TR Jr, Silver MD, Kaushal R. Association between use of a health information exchange system and hospital admissions. Appl Clin Inform. 2014;5(1):219-231.
9. Vest JR, Jung HY, Ostrovsky A, Das LT, McGinty GB. Image sharing technologies and reduction of imaging utilization: a systematic review and meta-analysis. J Am Coll Radiol. 2015;12(12 pt B):1371-1379.e3.
10. Walker DM. Does participation in health information exchange improve hospital efficiency? Health Care Manag Sci. 2018;21(3):426-438.
11. Gordon BD, Bernard K, Salzman J, Whitebird RR. Impact of health information exchange on emergency medicine clinical decision making. West J Emerg Med. 2015;16(7):1047-1051.
12. Hincapie A, Warholak T. The impact of health information exchange on health outcomes. Appl Clin Inform. 2011;2(4):499-507.
13. Rahurkar S, Vest JR, Menachemi N. Despite the spread of health information exchange, there is little evidence of its impact on cost, use, and quality of care. Health Aff (Millwood). 2015;34(3):477-483.
14. Eden KB, Totten AM, Kassakian SZ, et al. Barriers and facilitators to exchanging health information: a systematic review. Int J Med Inform. 2016;88:44-51.
15. Hersh WR, Totten AM, Eden K, et al. The evidence base for health information exchange. In: Dixon BE, ed. Health Information Exchange: Navigating and Managing a Network of Health Information Systems. Cambridge, MA: Academic Press; 2016:213-229.
16. Blavin F, Ramos C, Cafarella Lallemand N, Fass J, Ozanich G, Adler-Milstein J. Analyzing the public benefit attributable to interoperable health information exchange. https://aspe.hhs.gov/system/files/pdf/258851/AnalyzingthePublicBenefitAttributabletoInteroperableHealth.pdf. Published July 2017. Accessed May 22, 2019.
Beyond the Polygraph: Deception Detection and the Autonomic Nervous System
The US Department of Defense (DoD) and law enforcement agencies around the country utilize polygraph as an aid in security screenings and interrogation. It is assumed that a person being interviewed will have a visceral response when attempting to deceive the interviewer, and that this response can be detected by measuring the change in vital signs between questions. By using vital signs as an indirect measurement of deception-induced stress, the polygraph machine may provide a false positive or negative result if a patient has an inherited or acquired condition that affects the autonomic nervous system (ANS).
A variety of diseases, from alcohol use disorder to rheumatoid arthritis, can affect the ANS. In addition, a multitude of commonly prescribed drugs can affect the ANS. Although still in their infancy, functional magnetic resonance imaging (fMRI) and electroencephalography (EEG) deception detection techniques circumvent these issues. Dysautonomias may be an underappreciated cause of error in polygraph interpretation. Polygraph examiners and DoD agencies should be aware of the potential for these disorders to interfere with interpretation of results. In the near future, other modalities that do not measure autonomic variables may be utilized to avoid these pitfalls.
Polygraphy
Throughout history, humans have been interested in techniques and devices that can discern lies from the truth. Even in the ancient era, it was known that the act of lying had physiologic effects. In ancient Israel, if a woman accused of adultery should develop a swollen abdomen after drinking “waters of bitterness,” she was considered guilty of the crime, as described in Numbers 5:11-31. In Ancient China, those accused of fraud would be forced to hold dry rice in their mouths; if the expectorated rice was dry, the suspect was found guilty.1 We now know that catecholamines, particularly epinephrine, secreted during times of stress, cause relaxation of smooth muscle, leading to reduced bowel motility and dry mouth.2-4 However, most methods before the modern era were based more on superstition and chance rather than any sound physiologic premise.
When asked to discern the truth from falsehood based on their own perceptions, people correctly discern lies as false merely 47% of the time and truth as nondeceptive about 61% of the time.5 In short, unaided, we are very poor lie detectors. Therefore, there has been a great deal of interest in technology that can aid in lie detection. With improved technology and understanding of human physiology, it became a goal to quantify and measure the responses in vital signs, such as blood pressure (BP), heart rate, and breathing, brought on by the stress of deception. In 1881, the Italian criminologist Cesare Lombroso invented a glove that, when worn by a suspect, measured their BP.6-8 Changes in BP also were the target variable of the systolic BP deception test invented by William M. Marston, PhD, in 1915.8 Marston also experimented with measurements of other variables, such as muscle tension.9 In 1921, John Larson invented the first modern polygraph machine.7
Procedures
Today’s polygraph builds on these techniques. A standard polygraph measures respiration, heart rate, BP, and sudomotor function (sweating). Respiration is measured via strain gauges strapped around the chest and abdomen that respond to chest expansion during inhalation. BP and pulse can be measured through a variety of means, including finger pulse measurement or sphygmomanometer.8
Perspiration is measured by skin electrical conductance. Human sweat contains a variety of cations and anions, mostly sodium and chloride, but also potassium, bicarbonate, and lactate. The presence of these electrolytes alters electrical conduction at the skin surface when sweat is released.10
The exact questioning procedure used to perform a polygraph examination can vary. The Comparison Question Test is most commonly used. In this format, the interview consists of questions that are relevant to the investigation at hand, interspersed with control questions. The examiner compares the changes in vital signs and skin conduction to the baseline measurements generated during the pretest interview and during control questions.8 Using these standardized techniques, some studies have shown accuracy rates between 83% and 95% in controlled settings.8 However, studies performed outside of the polygraph community have found very high false positive rates, up to 50% or greater.11
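The comparative logic of the examination can be sketched numerically. The Python example below uses a simple difference of mean responses between relevant and comparison questions; the data and scoring rule are invented to illustrate the principle and are not an accredited polygraph scoring method.

import statistics

# Hypothetical skin-conductance responses (arbitrary units) by question type
relevant_responses = [4.2, 5.1, 4.8]
comparison_responses = [2.9, 3.4, 3.1]

def mean_difference_score(relevant, comparison):
    """Positive values indicate stronger reactions to relevant questions."""
    return statistics.mean(relevant) - statistics.mean(comparison)

score = mean_difference_score(relevant_responses, comparison_responses)
print(f"Relevant-minus-comparison score: {score:.2f}")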
The US Supreme Court has ruled that individual jurisdictions can decide whether or not to admit polygraph evidence in court, and the US Court of Appeals for the Eleventh Circuit has ruled that polygraph results are admissible only if both parties agree to their admission and are given sufficient notice.12,13 Currently, New Mexico is the only state that allows polygraph results to be used as evidence without a pretrial agreement; all other states either require such an agreement or forbid the results to be used as evidence.14
Although rarely used in federal and state courts as evidence, polygraphy is commonly used during investigations and in the hiring process of government agencies. DoD Directive 5210.48 and Instruction 5210.91 enable DoD investigative organizations (eg, Naval Criminal Investigative Service, National Security Agency, US Army Investigational Command) to use polygraph as an aid during investigations into suspected involvement with foreign intelligence, terrorism against the US, mishandling of classified documents, and other serious violations.15
The Role of the Physician in Polygraph Assessment
The physician is only rarely called upon to provide information regarding an individual’s medical condition or related medication use and the effect of these on polygraph results. In such cases, however, the physician must remember the primary fiduciary duty to the patient. Medical conditions cannot be disclosed without the patient’s consent, save in very specific situations (eg, Commanding Officer Inquiry, Tarasoff Duty to Protect). It is the polygraph examiner’s responsibility to be aware of potential confounders in a particular examination.10
Physicians in administrative or supervisory positions can have a responsibility to advise security and other officials regarding the fitness for certain duties of candidates with whom there is no physician-patient relationship. This may include an individual’s ability to undergo polygraph examination and the validity of such results. However, when a physician-patient relationship is involved, care must be taken to ensure that the patient understands that the relationship is protected both by professional standards and by law and that no information will be shared without the patient’s authorization (aside from those rare exceptions provided by law). Often, a straightforward explanation to the patient of the medical condition and any medication’s potential effects on polygraph results will be sufficient, allowing the patient to report as much as is deemed necessary to the polygraph examiner.
Polygraphy Pitfalls
Polygraphy presupposes that the subject will have a consistent and measurable physiologic response when he or she attempts to deceive the interviewer. The changes in BP, heart rate, respirations, and perspiration that are detected by polygraphy and interpreted by the examiner are controlled by the ANS (Table 1). There are a variety of diseases that are known to cause autonomic dysfunction (dysautonomia). Small fiber autonomic neuropathies often result in loss of sweating and altered heart rate and BP variation and can arise from many underlying conditions. Synucleinopathies, such as Parkinson disease, alter cardiovascular reflexes.14,16
Even diseases not commonly recognized as having a predominant clinical impact on ANS function can demonstrate measurable physiologic effects. For example, approximately 60% of patients with rheumatoid arthritis will have blunted cardiovagal baroreceptor responses and heart rate variability.17 ANS dysfunction is also a common sequela of alcoholism.18 Patients with diabetes mellitus often have an elevated resting heart rate and low heart rate variability due to dysregulated β-adrenergic activity.19 Reduced baroreceptor response and reduced heart rate variability could impair the polygraph interpreter’s ability to discern responses using heart rate. Individuals with ANS dysfunction that causes blunted physiologic responses could have inconclusive or, worse, false-negative polygraph results due to lack of variation between control and target questions.
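Heart rate variability, one of the measures blunted in these conditions, is commonly summarized with time-domain statistics such as SDNN and RMSSD computed from interbeat (RR) intervals. The Python sketch below shows that calculation on invented interval data.

import math
import statistics

# Invented RR (interbeat) intervals in milliseconds
rr_intervals_ms = [812, 795, 830, 805, 790, 820, 815]

# SDNN: standard deviation of all RR intervals
sdnn = statistics.stdev(rr_intervals_ms)

# RMSSD: root mean square of successive RR-interval differences
successive_diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
rmssd = math.sqrt(statistics.mean([d ** 2 for d in successive_diffs]))

print(f"SDNN: {sdnn:.1f} ms, RMSSD: {rmssd:.1f} ms")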
To our knowledge, no study has been performed on the validity of polygraphy in patients with any form of dysautonomia. Additionally, a 2011 process and compliance study of the DoD polygraph program specifically recommended that “adjudicators would benefit from training in polygraph capabilities and limitations.”20 Although specific requirements vary from program to program, all programs accredited by the American Polygraph Association provide training in physiology, psychology, and standardization of test results.
Many commonly prescribed medications have effects on the ANS that could affect the results of a polygraph exam (Table 2). For example, β blockers reduce β-adrenergic receptor activation in cardiac muscle and blood vessels, reducing heart rate, heart rate variability, cardiac contractility, and BP.21 This class of medication is prescribed for a variety of conditions, including congestive heart failure, hypertension, panic disorder, and posttraumatic stress disorder. Thus, a patient taking β blockers will have a blunted physiologic response to stress and an increased likelihood of an inconclusive or false-negative polygraph exam.
Some over-the-counter medications also have effects on autonomic function. Sympathomimetics such as pseudoephedrine or antihistamines with anticholinergic activity like diphenhydramine can both increase heart rate and BP.22,23 Of the 10 most prescribed medications of 2016, 5 have direct effects on the ANS or the variables measured by the polygraph machine.24 An exhaustive list of medication effects on autonomic function is beyond the scope of this article.
A medication that may affect the results of a polygraph study that is of special interest to the DoD and military is mefloquine. Mefloquine is an antimalarial drug that has been used by military personnel deployed to malaria endemic regions.25 In murine models, mefloquine has been shown to disrupt autonomic and respiratory control in the central nervous system.26 The neuropsychiatric adverse effects of mefloquine are well documented and can last for years after exposure to the drug.27 Therefore, mefloquine could affect the results of a polygraph test through both direct toxic effects on the ANS as well as causing anxiety and depression, potentially affecting the subject’s response to questioning.
Alternative Modalities
Given the pitfalls inherent with external physiologic measures for lie detection, additional modalities that bypass measurement of ANS-governed responses have been sought. Indeed, the integration and combination of more comprehensive modalities has come to be named the forensic credibility assessment.
Functional MRI
In 1991, researchers began using fMRI to see real-time perfusion changes in areas of the cerebral cortex between times of rest and mental stimulation.26 This modality provides a noninvasive technique for viewing which specific parts of the brain are stimulated during activity. When someone is engaged in active deception, the dorsolateral prefrontal cortex has greater perfusion than when that person is engaged in truth telling.28 Because fMRI images the central nervous system directly, it avoids the potential inaccuracies that can be seen in subjects with autonomic irregularities. In fact, fMRI may have superior sensitivity and specificity for lie detection compared with conventional polygraphy.29
Significant limitations to the use of fMRI include the necessity of expensive specialized equipment and trained personnel to operate the MRI. Agencies that use polygraph examinations may be unwilling to make such an investment. Further, subjects with metallic foreign bodies or noncompatible medical implants cannot undergo the MRI procedure. Finally, there have been bioethical and legal concerns raised that measuring brain activity during interrogation may endanger “cognitive freedom” and may even be considered unreasonable search and seizure under the Fourth Amendment to the US Constitution.30 However, fMRI—like polygraphy—can only measure the difference between brain perfusion in 2 states. The idea of fMRI as “mind reading” is largely a misconception.31
Electroencephalography
Various EEG modalities have received increased interest for lie detection. In EEG, electrodes are used to measure the summation of a multitude of postsynaptic potentials and the local voltage gradient they produce when cortical pyramidal neurons fire in synchrony.32 These voltage gradients are detectable at the scalp surface. Shortly after the invention of EEG, it was observed that specific stimuli generated unique and predictable changes in EEG morphology. These event-related potentials (ERPs) are detectable by scalp EEG shortly after the stimulus is given.33
ERPs can be elicited by a multitude of sensory stimuli, have a predictable and reproducible morphology, and are believed to be a psychophysiologic correlate of mental processing of stimuli.34 The P300 is an ERP characterized by a positive change in voltage occurring about 300 milliseconds after a stimulus. It is associated with stimulus processing and categorization.35 Since deception is a complex cognitive process involving recognizing pertinent stimuli and inventing false responses to them, it was theorized that the detection of a P300 ERP during an interview would mean the subject truly recognizes the stimulus and is denying such knowledge. Early studies of P300-based lie detection had variable accuracy, roughly 40% to 80%, depending on the study. Moreover, the rate of false negatives increased if subjects were coached on countermeasures, such as increasing the significance of distractor data or counting backward by 7s.36,37 Later studies have found ways of minimizing these issues, such as detection of a P900 ERP (a cortical potential at 900 milliseconds) that can be seen when subjects are attempting countermeasures.38
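The P300 is typically quantified by averaging stimulus-locked epochs and measuring the amplitude in a window around 300 milliseconds after the stimulus. The Python sketch below illustrates that computation on synthetic data; the sampling rate and window bounds are arbitrary choices for demonstration.

import numpy as np

FS = 250                  # sampling rate in Hz (illustrative choice)
N_SAMPLES = 200           # 800 ms epoch starting at stimulus onset
rng = np.random.default_rng(0)

# Simulate 40 stimulus-locked epochs: noise plus a positive deflection near 300 ms
t = np.arange(N_SAMPLES) / FS
p300_shape = 5.0 * np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))  # microvolts
epochs = rng.normal(0, 4, size=(40, N_SAMPLES)) + p300_shape

erp = epochs.mean(axis=0)                 # average across trials
window = (t >= 0.25) & (t <= 0.45)        # 250-450 ms measurement window
p300_amplitude = erp[window].mean()
print(f"Mean amplitude in P300 window: {p300_amplitude:.2f} microvolts")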
Another technique for increasing accuracy in EEG-mediated lie detection is measurement of the multifaceted electroencephalographic response (MER), which involves a more detailed analysis of multiple EEG electrode sites and how their signals change over time, using both visual comparison of multiple trials and bootstrap analysis.37 In particular, the memory- and encoding-related multifaceted electroencephalographic response (MERMER), which couples the P300 with an electrically negative impulse recorded at the frontal lobe and phasic changes in the global EEG, had greater accuracy than the P300 alone.37
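The bootstrap analysis mentioned above can be illustrated as resampling single-trial amplitudes to estimate how often the mean probe response exceeds the mean irrelevant-stimulus response. The data and the 0.95 decision threshold in the Python sketch below are invented for demonstration and are not the published MERMER procedure.

import numpy as np

rng = np.random.default_rng(1)
# Invented single-trial amplitudes (microvolts) in the P300 window
probe_amplitudes = rng.normal(4.0, 3.0, size=30)
irrelevant_amplitudes = rng.normal(1.0, 3.0, size=90)

n_boot = 2000
exceed = 0
for _ in range(n_boot):
    probe_mean = rng.choice(probe_amplitudes, size=probe_amplitudes.size, replace=True).mean()
    irrelevant_mean = rng.choice(irrelevant_amplitudes, size=irrelevant_amplitudes.size, replace=True).mean()
    if probe_mean > irrelevant_mean:
        exceed += 1

proportion = exceed / n_boot
print(f"Bootstrap proportion (probe > irrelevant): {proportion:.3f}")
print("information present" if proportion > 0.95 else "information absent or indeterminate")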
The benefits of EEG compared with those of fMRI include large reductions in cost, space requirements, and restrictions on use (EEG is safe for virtually all subjects, including those with metallic foreign bodies). However, like fMRI, EEG still requires trained personnel to operate and interpret. Also, it has yet to be tested outside of the laboratory.
Conclusion
The ability to detect deception is an important factor in determining security risk and adjudication of legal proceedings, but untrained persons are surprisingly poor at discerning truth from lies. The polygraph has been used by law enforcement and government agencies for decades to aid in interrogation and the screening of employees for security clearances and other types of access. However, results are vulnerable to inaccuracies in subjects with autonomic disorders and may be confounded by multiple medications. While emerging technologies such as fMRI and EEG may allow superior accuracy by bypassing ANS-based physiologic outputs, the polygraph examiner and the physician must be aware of the effect of autonomic dysfunction and of the medications that affect the ANS. This is particularly true within military medicine, as many patients within this population are subject to polygraph examination.
1. Ford EB. Lie detection: historical, neuropsychiatric and legal dimensions. Int J Law Psychiatry. 2006;29(3):159-177.
2. Ohrn PG. Catecholamine infusion and gastrointestinal propulsion in the rat. Acta Chir Scand Suppl. 1979(461):43-52.
3. Sakamoto H. The study of catecholamine, acetylcholine and bradykinin in buccal circulation in dogs. Kurume Med J. 1979;26(2):153-162.
4. Bond CF Jr, Depaulo BM. Accuracy of deception judgments. Pers Soc Psychol Rev. 2006;10(3):214-234.
5. Vicianova M. Historical techniques of lie detection. Eur J Psychol. 2015;11(3):522-534.
6. Matté JA. Forensic Psychophysiology Using the Polygraph: Scientific Truth Verification, Lie Detection. Williamsville, NY: JAM Publications; 2012.
7. Segrave K. Lie Detectors: A Social History. Jefferson, NC: McFarland & Company; 2004.
8. Nelson R. Scientific basis for polygraph testing. Polygraph. 2015;44(1):28-61.
9. Boucsein W. Electrodermal Activity. New York, NY: Springer Publishing; 2012.
10. US Congress, Office of Assessment and Technology. Scientific validity of polygraph testing: a research review and evaluation. https://ota.fas.org/reports/8320.pdf. Published 1983. Accessed June 12, 2019.
11. United States v Scheffer, 523 US 303 (1998).
12. United States v Piccinonna, 729 F Supp 1336 (SD Fl 1990).
13. Fridman DS, Janoe JS. The state of judicial gatekeeping in New Mexico. https://cyber.harvard.edu/daubert/nm.htm. Updated April 17, 1999. Accessed May 20, 2019.
14. Gibbons CH. Small fiber neuropathies. Continuum (Minneap Minn). 2014;20(5 Peripheral Nervous System Disorders):1398-1412.
15. US Department of Defense. Directive 5210.48: Credibility assessment (CA) program. https://fas.org/irp/doddir/dod/d5210_48.pdf. Updated February 12, 2018. Accessed May 30, 2019.
16. Postuma RB, Gagnon JF, Pelletier A, Montplaisir J. Prodromal autonomic symptoms and signs in Parkinson’s disease and dementia with Lewy bodies. Mov Disord. 2013;28(5):597-604.
17. Adlan AM, Lip GY, Paton JF, Kitas GD, Fisher JP. Autonomic function and rheumatoid arthritis: a systematic review. Semin Arthritis Rheum. 2014;44(3):283-304.
18. Di Ciaula A, Grattagliano I, Portincasa P. Chronic alcoholics retain dyspeptic symptoms, pan-enteric dysmotility, and autonomic neuropathy before and after abstinence. J Dig Dis. 2016;17(11):735-746.
19. Thaung HA, Baldi JC, Wang H, et al. Increased efferent cardiac sympathetic nerve activity and defective intrinsic heart rate regulation in type 2 diabetes. Diabetes. 2015;64(8):2944-2956.
20. US Department of Defense, Office of the Undersecretary of Defense for Intelligence. Department of Defense polygraph program process and compliance study: study report. https://fas.org/sgp/othergov/polygraph/dod-poly.pdf. Published December 19, 2011. Accessed May 20, 2019.
21. Ladage D, Schwinger RH, Brixius K. Cardio-selective beta-blocker: pharmacological evidence and their influence on exercise capacity. Cardiovasc Ther. 2013;31(2):76-83.
22. D’Souza RS, Mercogliano C, Ojukwu E, et al. Effects of prophylactic anticholinergic medications to decrease extrapyramidal side effects in patients taking acute antiemetic drugs: a systematic review and meta-analysis. Emerg Med J. 2018;35:325-331.
23. Gheorghiev MD, Hosseini F, Moran J, Cooper CE. Effects of pseudoephedrine on parameters affecting exercise performance: a meta-analysis. Sports Med Open. 2018;4(1):44.
24. Frellick M. Top-selling, top-prescribed drugs for 2016. https://www.medscape.com/viewarticle/886404. Published October 2, 2017. Accessed May 20, 2019.
25. Lall DM, Dutschmann M, Deuchars J, Deuchars S. The anti-malarial drug mefloquine disrupts central autonomic and respiratory control in the working heart brainstem preparation of the rat. J Biomed Sci. 2012;19:103.
26. Ritchie EC, Block J, Nevin RL. Psychiatric side effects of mefloquine: applications to forensic psychiatry. J Am Acad Psychiatry Law. 2013;41(2):224-235.
27. Belliveau JW, Kennedy DN Jr, McKinstry RC, et al. Functional mapping of the human visual cortex by magnetic resonance imaging. Science. 1991;254(5032):716-719.
28. Ito A, Abe N, Fujii T, et al. The contribution of the dorsolateral prefrontal cortex to the preparation for deception and truth-telling. Brain Res. 2012;1464:43-52.
29. Langleben DD, Hakun JG, Seelig D. Polygraphy and functional magnetic resonance imaging in lie detection: a controlled blind comparison using the concealed information test. J Clin Psychiatry. 2016;77(10):1372-1380.
30. Boire RG. Searching the brain: the Fourth Amendment implications of brain-based deception detection devices. Am J Bioeth. 2005;5(2):62-63; discussion W5.
31. Langleben DD. Detection of deception with fMRI: Are we there yet? Legal Criminological Psychol. 2008;13(1):1-9.
32. Marcuse LV, Fields MC, Yoo J. Rowans Primer of EEG. 2nd ed. Edinburgh, Scotland, United Kingdom: Elsevier; 2016.
33. Farwell LA, Donchin E. The truth will out: interrogative polygraphy (“lie detection”) with event-related brain potentials. Psychophysiology. 1991;28(5):531-547.
34. Sur S, Sinha VK. Event-related potential: an overview. Ind Psychiatry J. 2009;18(1):70-73.
35. Polich J. Updating P300: an integrative theory of P3a and P3b. Clinical Neurophysiol. 2007;118(10):2128-2148.
36. Mertens R, Allen, JJB. The role of psychophysiology in forensic assessments: Deception detection, ERPs, and virtual reality mock crime scenarios. Psychophysiology. 2008;45(2):286-298.
37. Rosenfeld JP, Labkovsky E. New P300-based protocol to detect concealed information: resistance to mental countermeasures against only half the irrelevant stimuli and a possible ERP indicator of countermeasures. Psychophysiology. 2010;47(6):1002-1010.
38. Farwell LA, Smith SS. Using brain MERMER testing to detect knowledge despite efforts to conceal. J Forensic Sci. 2001;46(1):135-143.
The US Department of Defense (DoD) and law enforcement agencies around the country utilize polygraph as an aid in security screenings and interrogation. It is assumed that a person being interviewed will have a visceral response when attempting to deceive the interviewer, and that this response can be detected by measuring the change in vital signs between questions. By using vital signs as an indirect measurement of deception-induced stress, the polygraph machine may provide a false positive or negative result if a patient has an inherited or acquired condition that affects the autonomic nervous system (ANS).
A variety of diseases from alcohol use disorder to rheumatoid arthritis can affect the ANS. In addition, a multitude of commonly prescribed drugs can affect the ANS. Although in their infancy, functional magnetic resonance imaging (fMRI) and EEG (electroencephalogram) deception detection techniques circumvent these issues. Dysautonomias may be an underappreciated cause of error in polygraph interpretation. Polygraph examiners and DoD agencies should be aware of the potential for these disorders to interfere with interpretation of results. In the near future, other modalities that do not measure autonomic variables may be utilized to avoid these pitfalls.
Polygraphy
Throughout history, humans have been interested in techniques and devices that can discern lies from the truth. Even in the ancient era, it was known that the act of lying had physiologic effects. In ancient Israel, if a woman accused of adultery should develop a swollen abdomen after drinking “waters of bitterness,” she was considered guilty of the crime, as described in Numbers 5:11-31. In Ancient China, those accused of fraud would be forced to hold dry rice in their mouths; if the expectorated rice was dry, the suspect was found guilty.1 We now know that catecholamines, particularly epinephrine, secreted during times of stress, cause relaxation of smooth muscle, leading to reduced bowel motility and dry mouth.2-4 However, most methods before the modern era were based more on superstition and chance rather than any sound physiologic premise.
When asked to discern the truth from falsehood based on their own perceptions, people correctly discern lies as false merely 47% of the time and truth as nondeceptive about 61% of the time.5 In short, unaided, we are very poor lie detectors. Therefore, a great deal of interest in technology that can aid in lie detection has ensued. With enhanced technology and understanding of human physiology came a renewed interest in lie detection. Since it was known that vital signs such as blood pressure (BP), heart rate, and breathing could be affected by the stressful situation brought on by deception, quantifying and measuring those responses in an effort to detect lying became a goal. In 1881, the Italian criminologist Cesare Lombroso invented a glove that when worn by a suspect, measured their BP.6-8 Changes in BP also were the target variable of the systolic BP deception test invented by William M. Marston, PhD, in 1915.8 Marston also experimented with measurements of other variables, such as muscle tension.9 In 1921, John Larson invented the first modern polygraph machine.7
Procedures
Today’s polygraph builds on these techniques. A standard polygraph measures respiration, heart rate, BP, and sudomotor function (sweating). Respiration is measured via strain gauges strapped around the chest and abdomen that respond to chest expansion during inhalation. BP and pulse can be measured through a variety of means, including finger pulse measurement or sphygmomanometer.8
Perspiration is measured by skin electrical conductance. Human sweat contains a variety of cations and anions—mostly sodium and chloride, but also potassium, bicarbonate, and lactate. The presence of these electrolytes alter electrical conduction at the skin surface when sweat is released.10
The exact questioning procedure used to perform a polygraph examination can vary. The Comparison Question Test is most commonly used. In this format, the interview consists of questions that are relevant to the investigation at hand, interspersed with control questions. The examiner compares the changes in vital signs and skin conduction to the baseline measurements generated during the pretest interview and during control questions.8 Using these standardized techniques, some studies have shown accuracy rates between 83% and 95% in controlled settings.8 However, studies performed outside of the polygraph community have found very high false positive rates, up to 50% or greater.11
The US Supreme Court has ruled that individual jurisdictions can decide whether or not to admit polygraph evidence in court, and the US Court of Appeals for the Eleventh Circuit has ruled that polygraph results are only admissible if both parties agree to it and are given sufficient notice.12,13 Currently, New Mexico is the only state that allows polygraph results to be used as evidence without a pretrial agreement; all other states either require such an agreement or forbid the results to be used as evidence.14
Although rarely used in federal and state courts as evidence, polygraphy is commonly used during investigations and in the hiring process of government agencies. DoD Directive 5210.48 and Instruction 5210.91 enable DoD investigative organizations (eg, Naval Criminal Investigative Service, National Security Agency, US Army Investigational Command) to use polygraph as an aid during investigations into suspected involvement with foreign intelligence, terrorism against the US, mishandling of classified documents, and other serious violations.15
The Role of the Physician in Polygraph Assessment
It may be rare that the physician is called upon to provide information regarding an individual’s medical condition or related medication use and the effect of these on polygraph results. In such cases, however, the physician must remember the primary fiduciary duty to the patient. Disclosure of medical conditions cannot be made without the patient’s consent, save in very specific situations (eg, Commanding Officer Inquiry, Tarasoff Duty to Protect, etc). It is the polygraph examiner’s responsibility to be aware of potential confounders in a particular examination.10
Physicians can have a responsibility when in administrative or supervisory positions, to advise security and other officials regarding the fitness for certain duties of candidates with whom there is no physician-patient relationship. This may include an individual’s ability to undergo polygraph examination and the validity of such results. However, when a physician-patient relationship is involved, care must be given to ensure that the patient understands that the relationship is protected both by professional standards and by law and that no information will be shared without the patient’s authorization (aside from those rare exceptions provided by law). Often, a straightforward explanation to the patient of the medical condition and any medication’s potential effects on polygraph results will be sufficient, allowing the patient to report as much as is deemed necessary to the polygraph examiner.
Polygraphy Pitfalls
Polygraphy presupposes that the subject will have a consistent and measurable physiologic response when he or she attempts to deceive the interviewer. The changes in BP, heart rate, respirations, and perspiration that are detected by polygraphy and interpreted by the examiner are controlled by the ANS (Table 1). There are a variety of diseases that are known to cause autonomic dysfunction (dysautonomia). Small fiber autonomic neuropathies often result in loss of sweating and altered heart rate and BP variation and can arise from many underlying conditions. Synucleinopathies, such as Parkinson disease, alter cardiovascular reflexes.14,16
Even diseases not commonly recognized as having a predominant clinical impact on ANS function can demonstrate measurable physiologic effect. For example, approximately 60% of patients with rheumatoid arthritis will have blunted cardiovagal baroreceptor responses and heart rate variability.17 ANS dysfunction is also a common sequela of alcoholism.18 Patients with diabetes mellitus often have an elevated resting heart rate and low heart rate variability due to dysregulated β-adrenergic activity.19 The impact of reduced baroreceptor response and reduced heart rate variability could impact the polygraph interpreter’s ability to discern responses using heart rate. Individuals with ANS dysfunction that causes blunted physiologic responses could have inconclusive or potentially worse false-negative polygraph results due to lack of variation between control and target questions.
To our knowledge, no study has been performed on the validity of polygraphy in patients with any form of dysautonomia. Additionally, a 2011 process and compliance study of the DoD polygraph program specifically recommended that “adjudicators would benefit from training in polygraph capabilities and limitations.”20 Although specific requirements vary from program to program, all programs accredited by the American Polygraph Association provide training in physiology, psychology, and standardization of test results.
Many commonly prescribed medications have effects on the ANS that could affect the results of a polygraph exam (Table 2). For example, β blockers reduce β adrenergic receptor activation in cardiac muscle and blood vessels, reducing heart rate, heart rate variability, cardiac contractility, and BP.21 This class of medication is prescribed for a variety of conditions, including congestive heart failure, hypertension, panic disorder, and posttraumatic stress disorder. Thus, a patient taking β blockers will have a blunted physiologic response to stress and have an increased likelihood of an inconclusive or false-negative polygraph exam.
Some over-the-counter medications also have effects on autonomic function. Sympathomimetics such as pseudoephedrine or antihistamines with anticholinergic activity like diphenhydramine can both increase heart rate and BP.22,23 Of the 10 most prescribed medications of 2016, 5 have direct effects on the ANS or the variables measured by the polygraph machine.24 An exhaustive list of medication effects on autonomic function is beyond the scope of this article.
A medication that may affect the results of a polygraph study that is of special interest to the DoD and military is mefloquine. Mefloquine is an antimalarial drug that has been used by military personnel deployed to malaria endemic regions.25 In murine models, mefloquine has been shown to disrupt autonomic and respiratory control in the central nervous system.26 The neuropsychiatric adverse effects of mefloquine are well documented and can last for years after exposure to the drug.27 Therefore, mefloquine could affect the results of a polygraph test through both direct toxic effects on the ANS as well as causing anxiety and depression, potentially affecting the subject’s response to questioning.
Alternative Modalities
Given the pitfalls inherent with external physiologic measures for lie detection, additional modalities that bypass measurement of ANS-governed responses have been sought. Indeed, the integration and combination of more comprehensive modalities has come to be named the forensic credibility assessment.
Functional MRI
Beginning in 1991, researchers began using fMRI to see real-time perfusion changes in areas of the cerebral cortex between times of rest and mental stimulation.26 This modality provides a noninvasive technique for viewing which specific parts of the brain are stimulated during activity. When someone is engaged in active deception, the dorsolateral prefrontal cortex has greater perfusion than when the patient is engaged in truth telling.28 Since fMRI involves imaging for evaluation of the central nervous system, it avoids the potential inaccuracies that can be seen in some subjects with autonomic irregularities. In fact, fMRI may have superior sensitivity and specificity for lie detection compared with that of conventional polygraphy.29
Significant limitations to the use of fMRI include the necessity of expensive specialized equipment and trained personnel to operate the MRI. Agencies that use polygraph examinations may be unwilling to make such an investment. Further, subjects with metallic foreign bodies or noncompatible medical implants cannot undergo the MRI procedure. Finally, there have been bioethical and legal concerns raised that measuring brain activity during interrogation may endanger “cognitive freedom” and may even be considered unreasonable search and seizure under the Fourth Amendment to the US Constitution.30 However, fMRI—like polygraphy—can only measure the difference between brain perfusion in 2 states. The idea of fMRI as “mind reading” is largely a misconception.31
Electroencephalography
Various EEG modalities have received increased interest for lie detection. In EEG, electrodes are used to measure the summation of a multitude of postsynaptic action potentials and the local voltage gradient they produce when cortical pyramidal neurons are fired in synchrony.32 These voltage gradients are detectable at the scalp surface. Shortly after the invention of EEG, it was observed that specific stimuli generated unique and predicable changes in EEG morphology. These event-related potentials (ERP) are detectable by scalp EEG shortly after the stimulus is given.33
ERPs can be elicited by a multitude of sensory stimuli, have a predictable and reproducible morphology, and are believed to be a psychophysiologic correlate of mental processing of stimuli.34 The P300 is an ERP characterized by a positive change in voltage occurring 300 milliseconds after a stimulus. It is associated with stimulus processing and categorization.35 Since deception is a complex cognitive process involving recognizing pertinent stimuli and inventing false responses to them, it was theorized that the detection of a P300 ERP during a patient interview would mean the patient truly recognizes the stimulus and is denying such knowledge. Early studies performed on P300 had variable accuracy for lie detection, roughly 40% to 80%, depending on the study. Thus, the rate of false negatives would increase if the subjects were coached on countermeasures, such as increasing the significance of distractor data or counting backward by 7s.36,37 Later studies have found ways of minimizing these issues, such as detection of a P900 ERP (a cortical potential at 900 milliseconds) that can be seen when subjects are attempting countermeasures.38
Another technique for increasing accuracy in EEG-mediated lie detection is measurement of multifaceted electroencephalographic response (MER), which involves a more detailed analysis of multiple EEG electrode sites and how the signaling changes over time using both visual comparison of multiple trials as well as bootstrap analysis.37 In particular, memory- and encoding-related multifaceted electroencephalographic response (MERMER) using P300 coupled with an electrically negative impulse recorded at the frontal lobe and phasic changes in the global EEG had superior accuracy than P300 alone.37
The benefits of EEG compared with that of fMRI include large reductions in cost, space, and restrictions for use in some individuals (EEG is safe for virtually all patients, including those with metallic foreign bodies). However, like fMRI, EEG still requires trained personnel to operate and interpret. Also, it has yet to be tested outside of the laboratory.
Conclusion
The ability to detect deception is an important factor in determining security risk and adjudication of legal proceedings, but untrained persons are surprisingly poor at discerning truth from lies. The polygraph has been used by law enforcement and government agencies for decades to aid in interrogation and the screening of employees for security clearances and other types of access. However, results are vulnerable to inaccuracies in subjects with autonomic disorders and may be confounded by multiple medications. While emerging technologies such as fMRI and EEG may allow superior accuracy by bypassing ANS-based physiologic outputs, the polygraph examiner and the physician must be aware of the effect of autonomic dysfunction and of the medications that affect the ANS. This is particularly true within military medicine, as many patients within this population are subject to polygraph examination.
The US Department of Defense (DoD) and law enforcement agencies around the country utilize polygraph as an aid in security screenings and interrogation. It is assumed that a person being interviewed will have a visceral response when attempting to deceive the interviewer, and that this response can be detected by measuring the change in vital signs between questions. By using vital signs as an indirect measurement of deception-induced stress, the polygraph machine may provide a false positive or negative result if a patient has an inherited or acquired condition that affects the autonomic nervous system (ANS).
A variety of diseases, from alcohol use disorder to rheumatoid arthritis, can affect the ANS, as can a multitude of commonly prescribed drugs. Although in their infancy, functional magnetic resonance imaging (fMRI) and electroencephalography (EEG) deception detection techniques circumvent these issues. Dysautonomias may be an underappreciated cause of error in polygraph interpretation. Polygraph examiners and DoD agencies should be aware of the potential for these disorders to interfere with interpretation of results. In the near future, other modalities that do not measure autonomic variables may be utilized to avoid these pitfalls.
Polygraphy
Throughout history, humans have been interested in techniques and devices that can discern lies from the truth. Even in the ancient era, it was known that the act of lying had physiologic effects. In ancient Israel, if a woman accused of adultery developed a swollen abdomen after drinking “waters of bitterness,” she was considered guilty of the crime, as described in Numbers 5:11-31. In ancient China, those accused of fraud would be forced to hold dry rice in their mouths; if the expectorated rice was dry, the suspect was found guilty.1 We now know that catecholamines, particularly epinephrine, secreted during times of stress, cause relaxation of smooth muscle, leading to reduced bowel motility and dry mouth.2-4 However, most methods before the modern era were based more on superstition and chance than on any sound physiologic premise.
When asked to discern truth from falsehood based on their own perceptions, people correctly identify lies as deceptive merely 47% of the time and truth as nondeceptive about 61% of the time.5 In short, unaided, we are very poor lie detectors, and a great deal of interest in technology that can aid in lie detection has ensued. As technology and the understanding of human physiology advanced, so did interest in lie detection. Since it was known that vital signs such as blood pressure (BP), heart rate, and breathing could be affected by the stressful situation brought on by deception, quantifying and measuring those responses in an effort to detect lying became a goal. In 1881, the Italian criminologist Cesare Lombroso invented a glove that, when worn by a suspect, measured their BP.6-8 Changes in BP also were the target variable of the systolic BP deception test invented by William M. Marston, PhD, in 1915.8 Marston also experimented with measurements of other variables, such as muscle tension.9 In 1921, John Larson invented the first modern polygraph machine.7
Procedures
Today’s polygraph builds on these techniques. A standard polygraph measures respiration, heart rate, BP, and sudomotor function (sweating). Respiration is measured via strain gauges strapped around the chest and abdomen that respond to chest expansion during inhalation. BP and pulse can be measured through a variety of means, including finger pulse measurement or sphygmomanometer.8
Perspiration is measured by skin electrical conductance. Human sweat contains a variety of cations and anions—mostly sodium and chloride, but also potassium, bicarbonate, and lactate. The presence of these electrolytes alters electrical conduction at the skin surface when sweat is released.10
The exact questioning procedure used to perform a polygraph examination can vary; the Comparison Question Test is most commonly used. In this format, the interview consists of questions that are relevant to the investigation at hand, interspersed with control (comparison) questions. The examiner compares the changes in vital signs and skin conduction with the baseline measurements generated during the pretest interview and during control questions.8 Using these standardized techniques, some studies have shown accuracy rates between 83% and 95% in controlled settings.8 However, studies performed outside of the polygraph community have found very high false-positive rates, up to 50% or greater.11
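The comparison of relevant and control question reactions lends itself to a simple numerical illustration. The following is a minimal, hypothetical Python sketch—not the scoring system actually used by polygraph examiners—in which each channel's deviation from the pretest baseline is compared between relevant and comparison questions; all values and the scoring rule are invented for illustration.

```python
# Hypothetical, simplified illustration of comparison-question scoring.
# Real polygraph scoring is more elaborate; all values here are invented.

baseline = {"heart_rate": 72, "systolic_bp": 122, "resp_rate": 14, "skin_conductance": 5.0}

# Mean measurements recorded while answering each question type (invented data).
relevant   = {"heart_rate": 84, "systolic_bp": 131, "resp_rate": 17, "skin_conductance": 7.1}
comparison = {"heart_rate": 78, "systolic_bp": 127, "resp_rate": 16, "skin_conductance": 6.4}

def deviation(sample, base):
    """Fractional change of each channel relative to the pretest baseline."""
    return {k: (sample[k] - base[k]) / base[k] for k in base}

rel_dev = deviation(relevant, baseline)
cmp_dev = deviation(comparison, baseline)

# Score each channel: +1 if the relevant question produced the larger reaction,
# -1 if the comparison question did. A strongly positive total would be read
# as suggestive of deception in this toy scheme.
score = sum(1 if rel_dev[k] > cmp_dev[k] else -1 for k in baseline)
print(f"channel score: {score} (range -4 to +4)")
```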
The US Supreme Court has ruled that individual jurisdictions can decide whether or not to admit polygraph evidence in court, and the US Court of Appeals for the Eleventh Circuit has ruled that polygraph results are only admissible if both parties agree to it and are given sufficient notice.12,13 Currently, New Mexico is the only state that allows polygraph results to be used as evidence without a pretrial agreement; all other states either require such an agreement or forbid the results to be used as evidence.14
Although rarely used in federal and state courts as evidence, polygraphy is commonly used during investigations and in the hiring process of government agencies. DoD Directive 5210.48 and Instruction 5210.91 enable DoD investigative organizations (eg, Naval Criminal Investigative Service, National Security Agency, US Army Investigational Command) to use polygraph as an aid during investigations into suspected involvement with foreign intelligence, terrorism against the US, mishandling of classified documents, and other serious violations.15
The Role of the Physician in Polygraph Assessment
It may be rare that the physician is called upon to provide information regarding an individual’s medical condition or related medication use and the effect of these on polygraph results. In such cases, however, the physician must remember the primary fiduciary duty to the patient. Disclosure of medical conditions cannot be made without the patient’s consent, save in very specific situations (eg, Commanding Officer Inquiry, Tarasoff Duty to Protect, etc). It is the polygraph examiner’s responsibility to be aware of potential confounders in a particular examination.10
Physicians in administrative or supervisory positions may have a responsibility to advise security and other officials regarding the fitness for certain duties of candidates with whom they have no physician-patient relationship. This may include an individual’s ability to undergo polygraph examination and the validity of such results. However, when a physician-patient relationship is involved, care must be given to ensure that the patient understands that the relationship is protected both by professional standards and by law and that no information will be shared without the patient’s authorization (aside from those rare exceptions provided by law). Often, a straightforward explanation to the patient of the medical condition and any medication’s potential effects on polygraph results will be sufficient, allowing the patient to report as much as is deemed necessary to the polygraph examiner.
Polygraphy Pitfalls
Polygraphy presupposes that the subject will have a consistent and measurable physiologic response when he or she attempts to deceive the interviewer. The changes in BP, heart rate, respirations, and perspiration that are detected by polygraphy and interpreted by the examiner are controlled by the ANS (Table 1). There are a variety of diseases that are known to cause autonomic dysfunction (dysautonomia). Small fiber autonomic neuropathies often result in loss of sweating and altered heart rate and BP variation and can arise from many underlying conditions. Synucleinopathies, such as Parkinson disease, alter cardiovascular reflexes.14,16
Even diseases not commonly recognized as having a predominant clinical impact on ANS function can demonstrate measurable physiologic effect. For example, approximately 60% of patients with rheumatoid arthritis will have blunted cardiovagal baroreceptor responses and heart rate variability.17 ANS dysfunction is also a common sequela of alcoholism.18 Patients with diabetes mellitus often have an elevated resting heart rate and low heart rate variability due to dysregulated β-adrenergic activity.19 The impact of reduced baroreceptor response and reduced heart rate variability could impact the polygraph interpreter’s ability to discern responses using heart rate. Individuals with ANS dysfunction that causes blunted physiologic responses could have inconclusive or potentially worse false-negative polygraph results due to lack of variation between control and target questions.
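To make the notion of heart rate variability concrete, the sketch below computes two standard time-domain indices (SDNN and RMSSD) from a list of RR intervals. The intervals are invented, and this calculation is not part of any polygraph protocol; it is only meant to illustrate the kind of variability that blunted autonomic function narrows.

```python
import math

# Invented RR intervals (milliseconds) between successive heartbeats.
rr_ms = [812, 790, 845, 800, 830, 795, 820, 805]

mean_rr = sum(rr_ms) / len(rr_ms)

# SDNN: sample standard deviation of all RR intervals.
sdnn = math.sqrt(sum((x - mean_rr) ** 2 for x in rr_ms) / (len(rr_ms) - 1))

# RMSSD: root mean square of successive RR-interval differences.
diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
rmssd = math.sqrt(sum(d ** 2 for d in diffs) / len(diffs))

print(f"SDNN = {sdnn:.1f} ms, RMSSD = {rmssd:.1f} ms")
# Blunted autonomic function (eg, diabetic neuropathy, beta blockade) tends to
# lower both indices, narrowing the response range a polygraph can detect.
```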
To our knowledge, no study has been performed on the validity of polygraphy in patients with any form of dysautonomia. Additionally, a 2011 process and compliance study of the DoD polygraph program specifically recommended that “adjudicators would benefit from training in polygraph capabilities and limitations.”20 Although specific requirements vary from program to program, all programs accredited by the American Polygraph Association provide training in physiology, psychology, and standardization of test results.
Many commonly prescribed medications have effects on the ANS that could affect the results of a polygraph exam (Table 2). For example, β blockers reduce β-adrenergic receptor activation in cardiac muscle and blood vessels, reducing heart rate, heart rate variability, cardiac contractility, and BP.21 This class of medication is prescribed for a variety of conditions, including congestive heart failure, hypertension, panic disorder, and posttraumatic stress disorder. Thus, a patient taking β blockers may have a blunted physiologic response to stress and an increased likelihood of an inconclusive or false-negative polygraph examination.
Some over-the-counter medications also have effects on autonomic function. Sympathomimetics such as pseudoephedrine and antihistamines with anticholinergic activity, such as diphenhydramine, can increase heart rate and BP.22,23 Of the 10 most prescribed medications of 2016, 5 have direct effects on the ANS or on the variables measured by the polygraph machine.24 An exhaustive list of medication effects on autonomic function is beyond the scope of this article.
One medication of special interest to the DoD and the military that may affect polygraph results is mefloquine, an antimalarial drug that has been used by military personnel deployed to malaria-endemic regions.25 In murine models, mefloquine has been shown to disrupt autonomic and respiratory control in the central nervous system.26 The neuropsychiatric adverse effects of mefloquine are well documented and can last for years after exposure to the drug.27 Therefore, mefloquine could affect the results of a polygraph test both through direct toxic effects on the ANS and by causing anxiety and depression, potentially affecting the subject’s response to questioning.
Alternative Modalities
Given the pitfalls inherent in external physiologic measures for lie detection, additional modalities that bypass measurement of ANS-governed responses have been sought. Indeed, the integration and combination of more comprehensive modalities has come to be called forensic credibility assessment.
Functional MRI
In 1991, researchers began using fMRI to observe real-time perfusion changes in areas of the cerebral cortex between times of rest and mental stimulation.26 This modality provides a noninvasive technique for viewing which specific parts of the brain are stimulated during activity. When someone is engaged in active deception, the dorsolateral prefrontal cortex has greater perfusion than when the person is telling the truth.28 Because fMRI evaluates the central nervous system directly, it avoids the potential inaccuracies that can be seen in some subjects with autonomic irregularities. In fact, fMRI may have superior sensitivity and specificity for lie detection compared with conventional polygraphy.29
Significant limitations to the use of fMRI include the necessity of expensive specialized equipment and trained personnel to operate the MRI. Agencies that use polygraph examinations may be unwilling to make such an investment. Further, subjects with metallic foreign bodies or noncompatible medical implants cannot undergo the MRI procedure. Finally, there have been bioethical and legal concerns raised that measuring brain activity during interrogation may endanger “cognitive freedom” and may even be considered unreasonable search and seizure under the Fourth Amendment to the US Constitution.30 However, fMRI—like polygraphy—can only measure the difference between brain perfusion in 2 states. The idea of fMRI as “mind reading” is largely a misconception.31
Electroencephalography
Various EEG modalities have received increased interest for lie detection. In EEG, electrodes are used to measure the summation of a multitude of postsynaptic potentials and the local voltage gradient they produce when cortical pyramidal neurons fire in synchrony.32 These voltage gradients are detectable at the scalp surface. Shortly after the invention of EEG, it was observed that specific stimuli generated unique and predictable changes in EEG morphology. These event-related potentials (ERPs) are detectable by scalp EEG shortly after the stimulus is given.33
ERPs can be elicited by a multitude of sensory stimuli, have a predictable and reproducible morphology, and are believed to be a psychophysiologic correlate of mental processing of stimuli.34 The P300 is an ERP characterized by a positive change in voltage occurring 300 milliseconds after a stimulus. It is associated with stimulus processing and categorization.35 Since deception is a complex cognitive process involving recognizing pertinent stimuli and inventing false responses to them, it was theorized that detection of a P300 ERP during an interview would indicate that the subject truly recognizes the stimulus even while denying such knowledge. Early studies of the P300 had variable accuracy for lie detection, roughly 40% to 80%, depending on the study. Moreover, the rate of false negatives increased if subjects were coached on countermeasures, such as increasing the significance of distractor data or counting backward by 7s.36,37 Later studies have found ways of minimizing these issues, such as detection of a P900 ERP (a cortical potential at 900 milliseconds) that can be seen when subjects are attempting countermeasures.38
Another technique for increasing accuracy in EEG-mediated lie detection is measurement of the multifaceted electroencephalographic response (MER), which involves a more detailed analysis of multiple EEG electrode sites and how the signaling changes over time, using both visual comparison of multiple trials and bootstrap analysis.37 In particular, the memory- and encoding-related multifaceted electroencephalographic response (MERMER), which couples the P300 with an electrically negative impulse recorded at the frontal lobe and phasic changes in the global EEG, had superior accuracy compared with the P300 alone.37
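The bootstrap analysis mentioned above can be illustrated with a toy example. The Python sketch below resamples single-trial P300 amplitudes recorded at probe (crime-relevant) and irrelevant stimuli and estimates how often the probe mean exceeds the irrelevant mean; the amplitudes, the number of resamples, and the decision rule are invented for illustration and do not reproduce any published protocol.

```python
import random

random.seed(0)

# Invented single-trial P300 amplitudes (microvolts) time-locked to each stimulus type.
probe      = [6.2, 5.8, 7.1, 6.5, 5.9, 6.8, 7.4, 6.0]   # crime-relevant items
irrelevant = [4.1, 4.9, 3.8, 5.0, 4.4, 4.6, 3.9, 4.7]   # distractor items

def boot_mean(samples):
    """Mean of one bootstrap resample (sampling with replacement)."""
    return sum(random.choice(samples) for _ in samples) / len(samples)

n_boot = 2000
wins = sum(boot_mean(probe) > boot_mean(irrelevant) for _ in range(n_boot))
confidence = wins / n_boot

# In this toy scheme, a high proportion of resamples in which the probe mean exceeds
# the irrelevant mean is taken as evidence the subject recognizes the probe item.
print(f"probe > irrelevant in {confidence:.1%} of {n_boot} bootstrap resamples")
```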
The benefits of EEG compared with fMRI include large reductions in cost, space requirements, and restrictions on use in some individuals (EEG is safe for virtually all patients, including those with metallic foreign bodies). However, like fMRI, EEG still requires trained personnel to operate and interpret. In addition, it has yet to be tested outside the laboratory.
Conclusion
The ability to detect deception is an important factor in determining security risk and in the adjudication of legal proceedings, but untrained persons are surprisingly poor at discerning truth from lies. The polygraph has been used by law enforcement and government agencies for decades to aid in interrogation and in the screening of employees for security clearances and other types of access. However, results are vulnerable to inaccuracies in subjects with autonomic disorders and may be confounded by multiple medications. While emerging technologies such as fMRI and EEG may allow superior accuracy by bypassing ANS-based physiologic outputs, the polygraph examiner and the physician must be aware of the effect of autonomic dysfunction and of the medications that affect the ANS. This is particularly true within military medicine, as many patients within this population are subject to polygraph examination.
1. Ford EB. Lie detection: historical, neuropsychiatric and legal dimensions. Int J Law Psychiatry. 2006;29(3):159-177.
2. Ohrn PG. Catecholamine infusion and gastrointestinal propulsion in the rat. Acta Chir Scand Suppl. 1979(461):43-52.
3. Sakamoto H. The study of catecholamine, acetylcholine and bradykinin in buccal circulation in dogs. Kurume Med J. 1979;26(2):153-162.
4. Bond CF Jr, Depaulo BM. Accuracy of deception judgments. Pers Soc Psychol Rev. 2006;10(3):214-234.
5. Vicianova M. Historical techniques of lie detection. Eur J Psychol. 2015;11(3):522-534.
6. Matté JA. Forensic Psychophysiology Using the Polygraph: Scientific Truth Verification, Lie Detection. Williamsville, NY: JAM Publications; 2012.
7. Segrave K. Lie Detectors: A Social History. Jefferson, NC: McFarland & Company; 2004.
8. Nelson R. Scientific basis for polygraph testing. Polygraph. 2015;44(1):28-61.
9. Boucsein W. Electrodermal Activity. New York, NY: Springer Publishing; 2012.
10. US Congress, Office of Technology Assessment. Scientific validity of polygraph testing: a research review and evaluation. https://ota.fas.org/reports/8320.pdf. Published 1983. Accessed June 12, 2019.
11. United States v Scheffer, 523 US 303 (1998).
12. United States v Piccinonna, 729 F Supp 1336 (SD Fl 1990).
13. Fridman DS, Janoe JS. The state of judicial gatekeeping in New Mexico. https://cyber.harvard.edu/daubert/nm.htm. Updated April 17, 1999. Accessed May 20, 2019.
14. Gibbons CH. Small fiber neuropathies. Continuum (Minneap Minn). 2014;20(5 Peripheral Nervous System Disorders):1398-1412.
15. US Department of Defense. Directive 5210.48: Credibility assessment (CA) program. https://fas.org/irp/doddir/dod/d5210_48.pdf. Updated February 12, 2018. Accessed May 30, 2019.
16. Postuma RB, Gagnon JF, Pelletier A, Montplaisir J. Prodromal autonomic symptoms and signs in Parkinson’s disease and dementia with Lewy bodies. Mov Disord. 2013;28(5):597-604.
17. Adlan AM, Lip GY, Paton JF, Kitas GD, Fisher JP. Autonomic function and rheumatoid arthritis: a systematic review. Semin Arthritis Rheum. 2014;44(3):283-304.
18. Di Ciaula A, Grattagliano I, Portincasa P. Chronic alcoholics retain dyspeptic symptoms, pan-enteric dysmotility, and autonomic neuropathy before and after abstinence. J Dig Dis. 2016;17(11):735-746.
19. Thaung HA, Baldi JC, Wang H, et al. Increased efferent cardiac sympathetic nerve activity and defective intrinsic heart rate regulation in type 2 diabetes. Diabetes. 2015;64(8):2944-2956.
20. US Department of Defense, Office of the Undersecretary of Defense for Intelligence. Department of Defense polygraph program process and compliance study: study report. https://fas.org/sgp/othergov/polygraph/dod-poly.pdf. Published December 19, 2011. Accessed May 20, 2019.
21. Ladage D, Schwinger RH, Brixius K. Cardio-selective beta-blocker: pharmacological evidence and their influence on exercise capacity. Cardiovasc Ther. 2013;31(2):76-83.
22. D’Souza RS, Mercogliano C, Ojukwu E, et al. Effects of prophylactic anticholinergic medications to decrease extrapyramidal side effects in patients taking acute antiemetic drugs: a systematic review and meta-analysis. Emerg Med J. 2018;35:325-331.
23. Gheorghiev MD, Hosseini F, Moran J, Cooper CE. Effects of pseudoephedrine on parameters affecting exercise performance: a meta-analysis. Sports Med Open. 2018;4(1):44.
24. Frellick M. Top-selling, top-prescribed drugs for 2016. https://www.medscape.com/viewarticle/886404. Published October 2, 2017. Accessed May 20, 2019.
25. Lall DM, Dutschmann M, Deuchars J, Deuchars S. The anti-malarial drug mefloquine disrupts central autonomic and respiratory control in the working heart brainstem preparation of the rat. J Biomed Sci. 2012;19:103.
26. Ritchie EC, Block J, Nevin RL. Psychiatric side effects of mefloquine: applications to forensic psychiatry. J Am Acad Psychiatry Law. 2013;41(2):224-235.
27. Belliveau JW, Kennedy DN Jr, McKinstry RC, et al. Functional mapping of the human visual cortex by magnetic resonance imaging. Science. 1991;254(5032):716-719.
28. Ito A, Abe N, Fujii T, et al. The contribution of the dorsolateral prefrontal cortex to the preparation for deception and truth-telling. Brain Res. 2012;1464:43-52.
29. Langleben DD, Hakun JG, Seelig D. Polygraphy and functional magnetic resonance imaging in lie detection: a controlled blind comparison using the concealed information test. J Clin Psychiatry. 2016;77(10):1372-1380.
30. Boire RG. Searching the brain: the Fourth Amendment implications of brain-based deception detection devices. Am J Bioeth. 2005;5(2):62-63; discussion W5.
31. Langleben DD. Detection of deception with fMRI: Are we there yet? Legal Criminological Psychol. 2008;13(1):1-9.
32. Marcuse LV, Fields MC, Yoo J. Rowan’s Primer of EEG. 2nd ed. Edinburgh, Scotland, United Kingdom: Elsevier; 2016.
33. Farwell LA, Donchin E. The truth will out: interrogative polygraphy (“lie detection”) with event-related brain potentials. Psychophysiology. 1991;28(5):531-547.
34. Sur S, Sinha VK. Event-related potential: an overview. Ind Psychiatry J. 2009;18(1):70-73.
35. Polich J. Updating P300: an integrative theory of P3a and P3b. Clin Neurophysiol. 2007;118(10):2128-2148.
36. Mertens R, Allen JJB. The role of psychophysiology in forensic assessments: deception detection, ERPs, and virtual reality mock crime scenarios. Psychophysiology. 2008;45(2):286-298.
37. Rosenfeld JP, Labkovsky E. New P300-based protocol to detect concealed information: resistance to mental countermeasures against only half the irrelevant stimuli and a possible ERP indicator of countermeasures. Psychophysiology. 2010;47(6):1002-1010.
38. Farwell LA, Smith SS. Using brain MERMER testing to detect knowledge despite efforts to conceal. J Forensic Sci. 2001;46(1):135-143.
Enoxaparin vs Continuous Heparin for Periprocedural Bridging in Patients With Atrial Fibrillation and Advanced Chronic Kidney Disease
There has been long-standing controversy over the use of parenteral anticoagulation for perioperative bridging in patients with atrial fibrillation (AF) undergoing elective surgery.1 The decision to bridge depends on the patient’s risk of thromboembolic complications and susceptibility to bleeding.1 The BRIDGE trial showed that no perioperative bridging was noninferior to bridging with low-molecular-weight heparin (LMWH) with respect to the rate of stroke and embolic events.2 However, according to the American College of Chest Physicians (CHEST) 2012 guidelines, patients in the BRIDGE trial would be deemed low risk for thromboembolic events, as evidenced by a mean CHADS2 (congestive heart failure [CHF], hypertension, age, diabetes mellitus, and stroke/transient ischemic attack) score of 2.3. Also, the BRIDGE study and many others excluded patients with advanced forms of chronic kidney disease (CKD).2,3
Similar to patients with AF, patients with advanced CKD (ACKD; stage 4 and 5 CKD) have an increased risk of stroke and venous thromboembolism (VTE).4,5 Perioperative anticoagulation bridging outcomes have not been adequately studied in patients with both AF and ACKD. Although unfractionated heparin (UFH) is preferred over LMWH in ACKD patients, enoxaparin can be used in this population.1,6 Enoxaparin 1 mg/kg once daily is approved by the US Food and Drug Administration (FDA) for use in patients with severe renal insufficiency, defined as creatinine clearance (CrCl) < 30 mL/min. This dosage adjustment followed studies of enoxaparin 1 mg/kg twice daily that showed a significant increase in major and minor bleeding in patients with severe renal insufficiency (CrCl < 30 mL/min) vs patients with CrCl > 30 mL/min.7 Among patients with severe renal insufficiency in the ExTRACT-TIMI 25 trial of myocardial infarction (MI), enoxaparin 1 mg/kg once daily showed no significant difference in nonfatal major bleeding vs UFH.8 In patients without renal impairment (no documentation of kidney disease), bridging therapy with LMWH was more often completed within 24 hours of hospital stay than bridging with UFH, with similar rates of VTE and major bleeding.9 In addition to being suitable for outpatient administration, enoxaparin has a more predictable pharmacokinetic profile, allowing for less monitoring and a lower incidence of heparin-induced thrombocytopenia (HIT) compared with UFH.6
The Michael E. DeBakey Veterans Affairs Medical Center (MEDVAMC) in Houston, Texas, is one of the largest US Department of Veterans Affairs (VA) hospitals, managing > 150,000 veterans in Southeast Texas and other southern states. As a referral center for traveling patients, it is crucial that MEDVAMC decrease hospital length of stay (LOS) to increase space for incoming patients. Reducing LOS also reduces costs and may be associated with a decreased incidence of nosocomial infections. Because of its significance to this facility, hospital LOS is an appropriate primary outcome for this study.
To our knowledge, bridging outcomes between LMWH and UFH in patients with AF and ACKD have never been studied. We hypothesized that using enoxaparin instead of heparin for periprocedural management would result in decreased hospital LOS, leading to a lower economic burden and lower incidence of nosocomial infections with no significant differences in major and minor bleeding and thromboembolic complications.10
Methods
This study was a single-center, retrospective chart review of adult patients from January 2008 to September 2017. The review was conducted at MEDVAMC and was approved by the research and development committee and by the Baylor College of Medicine Institutional Review Board. Formal consent was not required.
Included patients were aged ≥ 18 years with diagnoses of AF or atrial flutter and ACKD, as defined by an estimated glomerular filtration rate (eGFR) < 30 mL/min/1.73 m2 calculated with the Modification of Diet in Renal Disease (MDRD) study equation.11 Patients must have previously been on warfarin and required temporary interruption of warfarin for an elective procedure. During the interruption of warfarin therapy, patients were required to receive periprocedural anticoagulation with subcutaneous (SC) enoxaparin 1 mg/kg daily or continuous IV heparin per the MEDVAMC heparin protocol. Patients were excluded if they had experienced major bleeding in the 6 weeks prior to the elective procedure, had current thrombocytopenia (platelet count < 100 × 109/L), or had a history of HIT or a heparin allergy.
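For readers unfamiliar with the inclusion criterion, the 4-variable MDRD study equation can be sketched as below. The coefficients shown are the widely cited IDMS-traceable form, and the patient values are hypothetical; this is offered only to illustrate how the eGFR cutoff of < 30 mL/min/1.73 m2 would be applied, not as a reproduction of the study’s screening code.

```python
def mdrd_egfr(scr_mg_dl: float, age_years: float, female: bool, black: bool) -> float:
    """4-variable MDRD study equation (IDMS-traceable); eGFR in mL/min/1.73 m2."""
    egfr = 175.0 * (scr_mg_dl ** -1.154) * (age_years ** -0.203)
    if female:
        egfr *= 0.742
    if black:
        egfr *= 1.212
    return egfr

# Hypothetical patient: serum creatinine 3.1 mg/dL, 71-year-old non-Black male.
egfr = mdrd_egfr(3.1, 71, female=False, black=False)
print(f"eGFR = {egfr:.1f} mL/min/1.73 m2 -> "
      f"{'meets' if egfr < 30 else 'does not meet'} the ACKD inclusion criterion")
```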
This patient population was identified using TheraDoc Clinical Surveillance Software System (Charlotte, NC), which has prebuilt alert reviews for anticoagulation medications, including enoxaparin and heparin. An alert for patients on enoxaparin with serum creatinine (SCr) > 1.5 mg/dL was used to screen patients who met the inclusion criteria. A second alert identified patients on heparin. The VA Computerized Patient Record System (CPRS) was used to collect patient data.
Economic Analysis
An economic analysis was conducted using data from the VA Managerial Cost Accounting Reports. Data on the national average cost per bed day were used so that this information could be extrapolated to multiple VA institutions.12 The national average cost per day was determined by dividing the total cost by the number of bed days for the identified treating specialty during the 2018 fiscal period. Average cost per day data included costs for the bed day, surgery, radiology services, laboratory tests, pharmacy services, treatment location (ie, intensive care unit [ICU]), and all other costs associated with an inpatient stay. A cost analysis was performed using this average cost per bed day and the mean LOS for each treating specialty in the enoxaparin and UFH arms. The major outcome of the cost analysis was the total cost per average inpatient stay: the national average cost per bed day for each treating specialty was multiplied by the average LOS found for that treating specialty in this study, and the sum of these average costs across treating specialties yielded the total cost per average inpatient stay. Permission to use these data was granted by the Pharmacy and Critical Care Services at MEDVAMC.
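A minimal sketch of the arithmetic described above may help. The specialty names, per-day costs, and LOS values below are hypothetical placeholders rather than the fiscal year 2018 Managerial Cost Accounting figures used in the study; only the calculation (per-specialty cost per day × mean LOS, summed across specialties) follows the method described.

```python
# Hypothetical per-specialty national average cost per bed day (USD) and mean LOS (days).
# Values are placeholders; the study used fiscal year 2018 Managerial Cost Accounting data.
avg_cost_per_day = {"thoracic": 5200, "vascular": 4800, "medicine": 3100}
mean_los_days    = {"thoracic": 4.7,  "vascular": 2.9,  "medicine": 2.6}

# Total cost per average inpatient stay = sum over specialties of (cost/day x mean LOS).
total_cost = sum(avg_cost_per_day[s] * mean_los_days[s] for s in avg_cost_per_day)
print(f"total cost per average inpatient stay: ${total_cost:,.0f}")
```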
Patient Demographics and Characteristics
Data were collected on patient demographics (Table 1). Nosocomial infections, stroke/transient ischemic attack, MI, VTE, major and minor bleeding, and death are defined in Table 2.
The primary outcome of the study was hospital LOS. The study was powered at 90% with α = .05, which gives a required study population of 114 patients (1:1 enrollment ratio) to detect a statistically significant difference in hospital stay. This sample size was calculated using the mean hospital LOS (the primary objective) reported in the REGIMEN registry for LMWH (4.6 days) and UFH (10.3 days).9 To our knowledge, the incidence of nosocomial infections (a secondary outcome) has not been studied in this patient population; therefore, there was no basis for estimating an appropriate sample size to detect a difference in this outcome, and the goal was simply to include as many patients as possible to best assess this variable. Because of an expected high exclusion rate, 504 patients were reviewed to target a sample size of 120 patients. Due to the single-center nature of this review, the secondary outcomes of thromboembolic complications and major and minor bleeding were expected to be underpowered.
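The sample-size calculation can be reproduced in outline with a standard two-sample power routine. The REGIMEN means (4.6 and 10.3 days) are from the text, but the pooled standard deviation below is an assumed placeholder, since the article does not report the one actually used, so the resulting n is illustrative only.

```python
from statsmodels.stats.power import TTestIndPower

mean_lmwh, mean_ufh = 4.6, 10.3        # REGIMEN registry mean LOS (days)
assumed_pooled_sd = 9.5                # placeholder; not reported in the article

# Standardized effect size (Cohen's d) under the assumed pooled SD.
effect_size = abs(mean_ufh - mean_lmwh) / assumed_pooled_sd

# Solve for the per-group sample size at 90% power, two-sided alpha of .05.
n_per_group = TTestIndPower().solve_power(effect_size=effect_size,
                                          alpha=0.05, power=0.90,
                                          ratio=1.0, alternative='two-sided')
print(f"approximately {n_per_group:.0f} patients per group "
      f"(~{2 * n_per_group:.0f} total)")
```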
The final analysis compared the enoxaparin arm with the UFH arm. Univariate differences between the treatment groups were compared using the Fisher exact test for categorical variables. Demographic data and other continuous variables were analyzed with an unpaired t test to compare means between the 2 arms. Differences in outcomes and characteristics were deemed statistically significant when the P value was < .05. All P values reported were 2-tailed with a 95% CI. No statistical analysis was performed for the cost differences (based on LOS per treating specialty) between the 2 treatment arms. Statistical analyses were completed using GraphPad software (San Diego, CA).
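A minimal sketch of the two comparisons described above, using standard SciPy routines rather than GraphPad; the counts and values are invented and do not correspond to the study data.

```python
from scipy import stats

# Fisher exact test for a categorical outcome (invented 2x2 counts:
# rows = enoxaparin vs UFH, columns = event vs no event).
table = [[2, 12],
         [16, 20]]
odds_ratio, p_categorical = stats.fisher_exact(table)

# Unpaired (two-sample) t test for a continuous variable (invented LOS values, days).
enox_los = [8, 11, 9, 12, 10, 9]
ufh_los  = [15, 18, 20, 14, 17, 21]
t_stat, p_continuous = stats.ttest_ind(enox_los, ufh_los)

# Two-tailed P < .05 was the significance threshold used in the study.
print(f"Fisher exact P = {p_categorical:.3f}; t test P = {p_continuous:.3f}")
```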
Results
In total, 50 patients were analyzed in the study. There were 36 patients bridged with IV UFH at a concentration of 25,000 U/250 mL with an initial infusion rate of 12 U/kg/h. In the other arm, 14 patients were anticoagulated with renally dosed enoxaparin 1 mg/kg/d, with an average daily dose of 89.3 mg; the mean actual body weight in this group was 90.9 kg, which corresponds to the mean enoxaparin daily dose. Physicians of the primary team decided which parenteral anticoagulant to use. The difference in mean duration of inpatient parenteral anticoagulation between the groups was not statistically significant: 7.1 days with enoxaparin vs 9.6 days with UFH (P = .19). Patients in the enoxaparin arm were off warfarin therapy for an average of 6.0 days vs 7.5 days for the UFH group (P = .29). The duration of outpatient anticoagulation with enoxaparin was not analyzed in this study.
Patient and Procedure Characteristics
All patients had AF or atrial flutter, with 86% of patients (n = 43) having a CHADS2 score > 2 and 48% (n = 29) having a CHA2DS2VASc score > 4. Overall, the mean age was 71.3 years, with similar ethnicity distributions between arms. Patients had multiple comorbidities, as shown by a mean Charlson Comorbidity Index (CCI) of 7.7, and an increased risk of bleeding, as evidenced by 98% (n = 48) of patients having a HAS-BLED score ≥ 3. A greater percentage of patients bridged with enoxaparin had DM, a history of stroke and MI, and a heart valve, whereas UFH patients were more likely to be in stage 5 CKD (eGFR < 15 mL/min/1.73 m2) with a significantly lower mean eGFR (16.76 vs 22.64 mL/min/1.73 m2, P = .03). Furthermore, there were more patients on hemodialysis in the UFH arm (50%) than in the enoxaparin arm (21%), and the mean CrCl was lower with UFH (20.1 mL/min) than with enoxaparin (24.9 mL/min); however, the differences in hemodialysis and mean CrCl were not statistically significant. There were no patients on peritoneal dialysis in this review.
Procedure Characteristics
The average Revised Cardiac Risk Index (RCRI) score was about 3, placing these patients at Class IV risk (11% risk of a perioperative cardiac event) (Table 3). Nineteen patients (38%) underwent major surgery, and all but 1 of the procedures (major or minor) were invasive. The average length of surgery was 1.2 hours, and patients were most likely to undergo cardiothoracic procedures (38%). Two of 14 (14%) patients on enoxaparin were able to have surgery as an outpatient, whereas this did not occur in patients on UFH; the procedures completed for these 2 patients were a colostomy (minor surgery) and an arteriovenous graft repair (major surgery). There were no statistically significant differences in types of procedures between the 2 arms.
Outcomes
The primary outcome of this study, hospital LOS, differed significantly in the enoxaparin arm vs UFH: 10.2 days vs 17.5 days, P = .04 (Table 4). The time-to-discharge from initiation of parenteral anticoagulation was significantly reduced with enoxaparin (7.1 days) compared with UFH (11.9 days); P = .04. Although also reduced in the enoxaparin arm, ICU LOS did not show statistical significance (1.1 days vs 4.0 days, P = .09).
About 36% (n = 18) of patients in this study acquired an infection during hospitalization for elective surgery. The most common microorganism and site of infection were Enterococcus species and the urinary tract, respectively (Table 5). Nearly half (44%, n = 16) of the patients in the UFH group had a nosocomial infection vs 14% (n = 2) of enoxaparin-bridged patients, a difference approaching significance (P = .056). Both patients in the enoxaparin group had the urinary tract as the primary source of infection; 1 of these patients had undergone a urologic procedure.
Major bleeding occurred in 7% (n = 1) of enoxaparin patients vs 22% (n = 8) in the UFH arm, a difference that was not statistically significant (P = .41). Minor bleeding was similar between the enoxaparin and UFH arms (14% vs 19%, P = .99). Regarding thromboembolic complications, the enoxaparin group (0%) had a numerical reduction compared with UFH (11%), with VTE (n = 4) being the only component of the composite outcome that occurred (P = .57). There were 4 deaths within 30 days posthospitalization, all in the UFH group (P = .57). Due to the small sample size of this study, these outcomes (bleeding and thrombotic events) were not powered to detect a statistically significant difference.
Economic Analysis
The average cost differences (Table 6) in hospitalization between enoxaparin and UFH were calculated by multiplying the average LOS per treating specialty by the national average cost for an inpatient bed day in 2018.12 The treating specialty with the longest average LOS in the enoxaparin arm was thoracic (4.7 days). The UFH arm also had a long average LOS for the thoracic specialty (6.4 days); however, the vascular specialty (6.7 days) had the longest average LOS in that group. Based on the mean LOS of 10.2 days in the enoxaparin arm, stratified by treating specialty, the total cost per average inpatient stay was calculated as $51,710. In comparison, patients in the UFH arm had a total cost per average inpatient stay of $92,848.
Monitoring
Anti-factor Xa levels for LMWH monitoring were not analyzed in this study because of a lack of collected values; only 1 patient had an anti-factor Xa level checked during this time frame. Infusion rates of UFH were adjusted based on aPTT levels collected per the MEDVAMC inpatient anticoagulation protocol. The average percentage of aPTT values in the therapeutic range was 46.3%, and the mean (SD) time-to-therapeutic range was about 2.4 (1.3) days. Due to this study’s retrospective nature, there were inconsistencies in the availability of documentation of UFH infusion rates. For this reason, these values were not analyzed further.
Discussion
In 2017, the American College of Cardiology published the Periprocedural Anticoagulation Expert Consensus Pathway, which recommends that patients with AF at low risk of thromboembolism (CHA2DS2VASc score 1-4) not be bridged unless they have had a prior VTE or stroke/TIA.13 Nearly half the patients in this study were classified as moderate-to-high thrombotic risk, as evidenced by a CHA2DS2VASc score > 4 and a mean score of 4.8. Because of this study’s retrospective design spanning 2008 to 2017, many clinicians may have referenced the 2008 CHEST antithrombotic guidelines when deciding to bridge patients; these guidelines and the previous MEDVAMC anticoagulation protocol recommend bridging patients with AF and a CHADS2 score > 2 (moderate-to-high thrombotic risk), a criterion that all but 1 of the patients in this study met.1,14 In contrast to the landmark BRIDGE trial, the mean CHADS2 score in this study was 3.6, indicating a patient population at increased risk of stroke and embolism.
In addition to thromboembolic complications, patients in the current study also were at increased risk of clinically relevant bleeding, with a mean HAS-BLED score of 4.1 and nearly all patients having a score > 3. The complexity of the veteran population also was reflected in this study’s mean CCI (7.7) and RCRI (3.0), indicating a 0% estimated 10-year survival and an 11% risk of a perioperative cardiac event, respectively. A mean CCI of 7.7 is associated with a 13.3 relative risk of death within 6 years postoperation.15 All patients had a diagnosis of hypertension, and > 75% had this diagnosis complicated by DM. In addition, this was a population with extensive cardiovascular disease or risk, making it clinically relevant to the patients who would require periprocedural bridging.
Another positive aspect of this study is that all baseline characteristics, apart from renal function, were similar between arms, which strengthens the comparison of the 2 bridging modalities. We assume that more stage 5 CKD and dialysis patients were anticoagulated with UFH rather than enoxaparin because of concern for an increased risk of bleeding with a medication whose renal clearance is reduced by about 30% when CrCl is < 30 mL/min.16 Although enoxaparin 1 mg/kg/d is FDA approved as a therapeutic anticoagulant option, clinicians at MEDVAMC likely had reservations about its use in end-stage CKD patients. Unlike many studies, including the BRIDGE trial, this review did not exclude patients with ACKD, and the outcomes with enoxaparin are available for interpretation.
Not surprisingly, for patients included in this study, enoxaparin use led to a shorter hospital LOS, a reduced ICU LOS, and a quicker time-to-discharge from initiation of anticoagulation. This can be credited to the 100% bioavailability of SC enoxaparin and its suitability for outpatient therapy.16 Unlike those receiving IV UFH, patients requiring bridging can be discharged on SC enoxaparin injections until a therapeutic INR is achieved with warfarin. Hospital LOS in both arms was longer in this study than in other studies,9 which may reflect clinicians’ greater caution with renally insufficient patients and the multiple comorbidities of the patients included in this study. According to an economic analysis performed by Amorosi and colleagues in 2004, bridging with enoxaparin instead of UFH can save up to $3,733 per patient and reduce bridging costs by 63% to 85%, driven primarily by decreased hospital LOS.10
Economic Outcome
In our study, we conducted a cost analysis using national VA data that indicated a $41,138 or 44% reduction in total cost per average inpatient stay when bridging 1 patient with enoxaparin vs UFH. The benefit of this cost analysis is that it reflects direct costs at VA institutions nationally; this will allow these data to be useful for practitioners at MEDVAMC and other VA hospitals. Stratifying the costs by treating specialty instead of treatment location minimized skewing of the data as there were some patients with long LOS in the ICU. No patients in the enoxaparin arm were treated in otolaryngology, which may have skewed the data. The data included direct costs for beds as well as costs for multiple services, such as procedures, pharmacy, nursing, laboratory tests, and imaging. Unlike the Amorosi study, our review did not include acquisition costs for enoxaparin syringes and bags of UFH or laboratory costs for aPTT and anti-factor Xa levels in part because of the data source and the difficulty calculating costs over a 10-year span.
Patients in the enoxaparin arm had a trend toward fewer hospital-acquired infections than those in the UFH arm, which we believe is due to a decreased LOS (in both total hospital and ICU days) and fewer blood draws needed for monitoring. It also may be attributable to a longer mean duration of surgery in the UFH arm (1.3 hours) vs the enoxaparin arm (0.9 hours). The percentage of patients with procedures ≥ 45 minutes and the types of procedures were similar between arms. However, these differences were not statistically significant. In addition, elderly males who are hospitalized may require a urinary catheter (due to urinary retention), and catheter-associated urinary tract infection (CAUTI) is one of the most commonly reported infections in acute care hospitals in the US. This is in line with our patient population and may be a supplementary reason for the increased incidence of infection with UFH. However, whether urinary catheters were used in these patients was not evaluated in this study.
Despite being at increased risk of a major adverse cardiovascular event (MACE), no patients in either arm had a stroke/TIA or MI within 30 days postprocedure. The only thromboembolic events documented were VTEs, which occurred in 4 patients on UFH. Four patients died in this study, all in the UFH arm. The differences in thromboembolic complications, death, and major and minor bleeding cannot be considered meaningful, as this study was underpowered for these outcomes. Although anti-factor Xa monitoring is recommended in ACKD patients on enoxaparin, this monitoring was not routinely performed in this study. Another limitation was the inability to adequately assess the appropriateness of nurse-adjusted UFH infusion rates, largely due to the retrospective nature of this study. The variability of the aPTT percentage in therapeutic range and the time-to-therapeutic range was indicative of the difficulties of monitoring the safety and efficacy of UFH.
In 1991, Cruickshank and colleagues conducted a study in which a standard nomogram (similar to the MEDVAMC nomogram) for the adjustment of IV heparin was implemented at a single hospital.17 The success rate (aPTT percentage in therapeutic range) was 59.4%, and the average time-to-therapeutic range was about 1 day. The success rate (46.3%) and time-to-therapeutic range (2.4 days) in our study were, respectively, lower and longer than expected. One potential reason for this discrepancy could be the difference in indication, as the patients studied by Cruickshank and colleagues were being treated for VTE, whereas patients in our study had AF or atrial flutter. Also, there were inconsistencies in the availability of documentation of heparin monitoring parameters due to the study time frame and retrospective design. Patients on UFH who do not reach the therapeutic range in a timely manner are at greater risk of MACE and major/minor bleeding; our study was not powered to detect these outcomes.
Strengths and Limitations
A significant limitation of this study was its small sample size; the study did not meet power for the primary outcome, and it is unknown whether it met power for nosocomial infections. The study also was not powered to assess other adverse events, such as thromboembolic complications, bleeding, and death. The arms had unequal numbers of patients, which made it more difficult to compare the 2 populations appropriately, and the study did not report medians for patient characteristics and outcomes.
Due to this study’s time frame, the clinical pharmacy services at MEDVAMC were not as robust as they are now, which is why decisions about which anticoagulant to use were primarily physician based. The use of TheraDoc to identify patients posed the risk of missing patients who may not have had the appropriate laboratory tests performed (ie, SCr). Patients on UFH had a lower eGFR than patients on enoxaparin, which may limit extrapolation of these findings to enoxaparin use in end-stage renal disease. The lower eGFR and higher number of dialysis patients in the UFH arm also may have increased the occurrence of labile INRs and bleeding outcomes. Patients on hemodialysis typically have more comorbidities and an increased risk of infection due to the frequent use of catheters and needles to access the bloodstream. In addition, potential differences in catheter use and duration between groups were not identified. Had these parameters been studied, the data may have better explained the increased incidence of infection in the UFH arm.
Strengths of this study include a complex patient population with similar characteristics, distribution of ethnicities representative of the US population, patients at moderate-to-high thrombotic risk, the analysis of nosocomial infections, and the exclusion of patients with normal renal function or moderate CKD.
Conclusion
To our knowledge, this is the first study to compare periprocedural bridging outcomes and the incidence of nosocomial infections in patients with AF and ACKD. This review provides new evidence that, in this patient population, enoxaparin is a potential anticoagulant option for reducing hospital LOS and hospital-acquired infections. Compared with UFH, bridging with enoxaparin reduced hospital LOS and anticoagulation time-to-discharge by 7 and 5 days, respectively, and decreased the incidence of nosocomial infections by 30 percentage points. Using the mean LOS per treating specialty for both arms, bridging 1 patient with AF with enoxaparin vs UFH can potentially lead to an estimated $40,000 (44%) reduction in the total cost of hospitalization. Enoxaparin also showed no increase in mortality or adverse events (stroke/TIA, MI, VTE) vs UFH, but it is important to note that this study was not powered to find a significant difference in these outcomes. Because the mean eGFR of patients on enoxaparin was 22.6 mL/min/1.73 m2 and only 1 in 5 had stage 5 CKD, at this time we do not recommend enoxaparin for periprocedural use in stage 5 CKD or in patients on hemodialysis. Larger studies, including randomized trials, are needed in this patient population to further evaluate these outcomes and assess the use of enoxaparin in patients with ACKD.
1. Douketis JD, Spyropoulos AC, Spencer FA, et al. Perioperative management of antithrombotic therapy: Antithrombotic Therapy and Prevention of Thrombosis, 9th ed: American College of Chest Physicians Evidence-Based Clinical Practice Guidelines. Chest. 2012;141(2)(suppl):e326S-350S.
2. Douketis JD, Spyropoulos AC, Kaatz S, et al; BRIDGE Investigators. Perioperative bridging anticoagulation in patients with atrial fibrillation. N Engl J Med. 2015;373(9):823-833.
3. Hammerstingl C, Schmitz A, Fimmers R, Omran H. Bridging of chronic oral anticoagulation with enoxaparin in patients with atrial fibrillation: results from the prospective BRAVE registry. Cardiovasc Ther. 2009;27(4):230-238.
4. Dad T, Weiner DE. Stroke and chronic kidney disease: epidemiology, pathogenesis, and management across kidney disease stages. Semin Nephrol. 2015;35(4):311-322.
5. Wattanakit K, Cushman M. Chronic kidney disease and venous thromboembolism: epidemiology and mechanisms. Curr Opin Pulm Med. 2009;15(5):408-412.
6. Saltiel M. Dosing low molecular weight heparins in kidney disease. J Pharm Pract. 2010;23(3):205-209.
7. Spinler SA, Inverso SM, Cohen M, Goodman SG, Stringer KA, Antman EM; ESSENCE and TIMI 11B Investigators. Safety and efficacy of unfractionated heparin versus enoxaparin in patients who are obese and patients with severe renal impairment: analysis from the ESSENCE and TIMI 11B studies. Am Heart J. 2003;146(1):33-41.
8. Fox KA, Antman EM, Montalescot G, et al. The impact of renal dysfunction on outcomes in the ExTRACT-TIMI 25 trial. J Am Coll Cardiol. 2007;49(23):2249-2255.
9. Spyropoulos AC, Turpie AG, Dunn AS, et al; REGIMEN Investigators. Clinical outcomes with unfractionated heparin or low-molecular-weight heparin as bridging therapy in patients on long-term oral anticoagulants: the REGIMEN registry. J Thromb Haemost. 2006;4(6):1246-1252.
10. Amorosi SL, Tsilimingras K, Thompson D, Fanikos J, Weinstein MC, Goldhaber SZ. Cost analysis of “bridging therapy” with low-molecular-weight heparin versus unfractionated heparin during temporary interruption of chronic anticoagulation. Am J Cardiol. 2004;93(4):509-511.
11. Inker LA, Astor BC, Fox CH, et al. KDOQI US commentary on the 2012 KDIGO clinical practice guideline for the evaluation and management of CKD. Am J Kidney Dis. 2014;63(5):713-735.
12. US Department of Veterans Affairs. Managerial Cost Accounting Financial User Support Reports: fiscal year 2018. https://www.herc.research.va.gov/include/page.asp?id=managerial-cost-accounting.
13. Doherty JU, Gluckman TJ, Hucker WJ, et al. 2017 ACC Expert Consensus Decision Pathway for Periprocedural Management of Anticoagulation in Patients With Nonvalvular Atrial Fibrillation: a report of the American College of Cardiology Clinical Expert Consensus Document Task Force. J Am Coll Cardiol. 2017;69(7):871-898.
14. Kearon C, Kahn SR, Agnelli G, et al. Antithrombotic therapy for venous thromboembolic disease: American College of Chest Physicians Evidence-Based Clinical Practice Guidelines (8th Edition). Chest. 2008;133(6 suppl):454S-545S.
15. Charlson M, Szatrowski TP, Peterson J, Gold J. Validation of a combined comorbidity index. J Clin Epidemiol. 1994;47(11):1245-1251.
16. Lovenox [package insert]. Bridgewater, NJ: Sanofi-Aventis; December 2017.
17. Cruickshank MK, Levine MN, Hirsh J, Roberts R, Siguenza M. A standard heparin nomogram for the management of heparin therapy. Arch Intern Med. 1991;151(2):333-337.
18. Steinberg BA, Peterson ED, Kim S, et al; Outcomes Registry for Better Informed Treatment of Atrial Fibrillation Investigators and Patients. Use and outcomes associated with bridging during anticoagulation interruptions in patients with atrial fibrillation: findings from the Outcomes Registry for Better Informed Treatment of Atrial Fibrillation (ORBIT-AF). Circulation. 2015;131(5):488-494.
19. Verheugt FW, Steinhubl SR, Hamon M, et al. Incidence, prognostic impact, and influence of antithrombotic therapy on access and nonaccess site bleeding in percutaneous coronary intervention. JACC Cardiovasc Interv. 2011;4(2):191-197.
20. Bijsterveld NR, Peters RJ, Murphy SA, Bernink PJ, Tijssen JG, Cohen M. Recurrent cardiac ischemic events early after discontinuation of short-term heparin treatment in acute coronary syndromes: results from the Thrombolysis in Myocardial Infarction (TIMI) 11B and Efficacy and Safety of Subcutaneous Enoxaparin in Non-Q-Wave Coronary Events (ESSENCE) studies. J Am Coll Cardiol. 2003;42(12):2083-2089.
There has been long-standing controversy about the use of parenteral anticoagulation for perioperative bridging in patients with atrial fibrillation (AF) undergoing elective surgery.1 The decision to bridge depends on the patient’s risk of thromboembolic complications and risk of bleeding.1 The BRIDGE trial showed that forgoing perioperative bridging was noninferior to bridging with a low-molecular-weight heparin (LMWH) for the rate of stroke and embolic events.2 However, according to the American College of Chest Physicians (CHEST) 2012 guidelines, patients in the BRIDGE trial would be considered at low risk for thromboembolic events, as reflected by a mean CHADS2 (congestive heart failure [CHF], hypertension, age, diabetes mellitus, and stroke/transient ischemic attack) score of 2.3. Also, the BRIDGE study and many others excluded patients with advanced forms of chronic kidney disease (CKD).2,3
Similar to patients with AF, patients with advanced CKD (ACKD; stage 4 and 5 CKD) have an increased risk of stroke and venous thromboembolism (VTE).4,5 Periprocedural anticoagulation bridging outcomes have not been adequately studied in patients with AF and ACKD. Although unfractionated heparin (UFH) is preferred over LMWH in patients with ACKD, enoxaparin can be used in this population.1,6 Enoxaparin 1 mg/kg once daily is approved by the US Food and Drug Administration (FDA) for use in patients with severe renal insufficiency, defined as creatinine clearance (CrCl) < 30 mL/min. This dosage adjustment followed studies of enoxaparin 1 mg/kg twice daily that showed a significant increase in major and minor bleeding in patients with CrCl < 30 mL/min vs patients with CrCl > 30 mL/min.7 In an analysis of myocardial infarction (MI) outcomes among patients with severe renal insufficiency in the ExTRACT-TIMI 25 trial, enoxaparin 1 mg/kg once daily showed no significant difference in nonfatal major bleeding vs UFH.8 In patients without renal impairment (no documentation of kidney disease), bridging with LMWH was more often completed within 24 hours of hospital stay than bridging with UFH, with similar rates of VTE and major bleeding.9 In addition to being suitable for outpatient administration, enoxaparin has a more predictable pharmacokinetic profile, requires less monitoring, and carries a lower incidence of heparin-induced thrombocytopenia (HIT) than UFH.6
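To make the dosing threshold above concrete, the following minimal sketch (illustrative only) estimates CrCl with the Cockcroft-Gault equation, which the FDA labeling references, and selects between standard and renally adjusted therapeutic enoxaparin dosing; the patient values are hypothetical.

```python
def cockcroft_gault_crcl(age_years, weight_kg, scr_mg_dl, female):
    """Estimate creatinine clearance (mL/min) with the Cockcroft-Gault equation."""
    crcl = ((140 - age_years) * weight_kg) / (72 * scr_mg_dl)
    return crcl * 0.85 if female else crcl


def therapeutic_enoxaparin_dose(weight_kg, crcl_ml_min):
    """Return a therapeutic enoxaparin regimen, applying the renal adjustment
    described above (1 mg/kg every 12 h, reduced to 1 mg/kg once daily when
    CrCl < 30 mL/min)."""
    dose_mg = round(weight_kg)  # 1 mg/kg, rounded for illustration
    if crcl_ml_min < 30:
        return f"{dose_mg} mg SC once daily"
    return f"{dose_mg} mg SC every 12 hours"


# Hypothetical patient: 72-year-old man, 91 kg, serum creatinine 3.2 mg/dL
crcl = cockcroft_gault_crcl(72, 91, 3.2, female=False)
print(round(crcl, 1), "->", therapeutic_enoxaparin_dose(91, crcl))  # 26.9 -> once daily
```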
The Michael E. DeBakey Veterans Affairs Medical Center (MEDVAMC) in Houston, Texas, is one of the largest US Department of Veterans Affairs (VA) hospitals, managing > 150,000 veterans in Southeast Texas and other southern states. As a referral center for traveling patients, it is crucial that MEDVAMC decrease hospital length of stay (LOS) to make room for incoming patients. Reducing LOS also reduces costs and may be associated with a lower incidence of nosocomial infections. Because of its significance to this facility, hospital LOS is an appropriate primary outcome for this study.
To our knowledge, bridging outcomes with LMWH vs UFH in patients with AF and ACKD have never been studied. We hypothesized that using enoxaparin instead of UFH for periprocedural management would decrease hospital LOS, leading to a lower economic burden and a lower incidence of nosocomial infections, with no significant differences in major or minor bleeding or thromboembolic complications.10
Methods
This study was a single-center, retrospective chart review of adult patients from January 2008 to September 2017. The review was conducted at MEDVAMC and was approved by the research and development committee and by the Baylor College of Medicine Institutional Review Board. Formal consent was not required.
Included patients were aged ≥ 18 years with diagnoses of AF or atrial flutter and ACKD, defined as an estimated glomerular filtration rate (eGFR) < 30 mL/min/1.73 m2 calculated with the Modification of Diet in Renal Disease Study (MDRD) equation.11 Patients must have previously been on warfarin and required temporary interruption of warfarin for an elective procedure. During the interruption of warfarin therapy, patients were required to receive periprocedural anticoagulation with subcutaneous (SC) enoxaparin 1 mg/kg daily or continuous IV heparin per the MEDVAMC heparin protocol. Patients were excluded if they had experienced major bleeding in the 6 weeks prior to the elective procedure, had current thrombocytopenia (platelet count < 100 × 109/L), or had a history of HIT or a heparin allergy.
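For reference, a minimal sketch of the 4-variable MDRD Study equation used for the eGFR inclusion criterion is shown below; the IDMS-traceable coefficient of 175 is assumed, and the patient values are hypothetical.

```python
def mdrd_egfr(scr_mg_dl, age_years, female, black):
    """Estimate GFR (mL/min/1.73 m2) with the 4-variable MDRD Study equation
    (IDMS-traceable coefficient of 175 assumed)."""
    egfr = 175 * (scr_mg_dl ** -1.154) * (age_years ** -0.203)
    if female:
        egfr *= 0.742
    if black:
        egfr *= 1.212
    return egfr


# Hypothetical 71-year-old man with serum creatinine 3.0 mg/dL meets the < 30 criterion:
print(round(mdrd_egfr(3.0, 71, female=False, black=False), 1))  # about 20.7
```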
This patient population was identified using TheraDoc Clinical Surveillance Software System (Charlotte, NC), which has prebuilt alert reviews for anticoagulation medications, including enoxaparin and heparin. An alert for patients on enoxaparin with serum creatinine (SCr) > 1.5 mg/dL was used to screen patients who met the inclusion criteria. A second alert identified patients on heparin. The VA Computerized Patient Record System (CPRS) was used to collect patient data.
Economic Analysis
An economic analysis was conducted using data from the VA Managerial Cost Accounting reports. The national average cost per bed day was used so that the findings could be extrapolated to other VA institutions.12 The national average cost per day was determined by dividing the total cost by the number of bed days for the identified treating specialty during fiscal year 2018. Average cost per day data included costs for the bed day, surgery, radiology services, laboratory tests, pharmacy services, treatment location (eg, intensive care units [ICUs]), and all other costs associated with an inpatient stay. A cost analysis was performed using this average cost per bed day and the mean LOS for enoxaparin and UFH within each treating specialty. The major outcome of the cost analysis was the total cost per average inpatient stay: the national average cost per bed day for each treating specialty was multiplied by the average LOS found for that treating specialty in this study, and the sum of these average costs across treating specialties gave the total cost per average inpatient stay. Permission to use these data was granted by the Pharmacy and Critical Care Services at MEDVAMC.
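The cost calculation described above reduces to a sum of per-specialty products (average cost per bed day × mean LOS). The sketch below illustrates only the arithmetic; the specialty names, cost figures, and most LOS values are hypothetical, as the actual Managerial Cost Accounting values are not reproduced here.

```python
# Hypothetical national average cost per bed day ($) and mean LOS (days) by
# treating specialty for one arm; real values come from the VA Managerial Cost
# Accounting reports and the per-specialty LOS observed in this study (Table 6).
avg_cost_per_bed_day = {"thoracic": 5200.0, "vascular": 4800.0, "medicine": 3500.0}
mean_los_days = {"thoracic": 4.7, "vascular": 2.9, "medicine": 2.6}

# Total cost per average inpatient stay = sum over specialties of (cost/day x mean LOS)
total_cost = sum(avg_cost_per_bed_day[s] * mean_los_days[s] for s in mean_los_days)
print(f"${total_cost:,.0f}")
```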
Patient Demographics and Characteristics
Data were collected on patient demographics (Table 1). Nosocomial infections, stroke/transient ischemic attack, MI, VTE, major and minor bleeding, and death are defined in Table 2.
The primary outcome of the study was hospital LOS. With 90% power and α = .05, a study population of 114 patients (1:1 enrollment ratio) was required to detect a statistically significant difference in hospital LOS. This sample size was calculated using the mean hospital LOS (the primary objective) reported in the REGIMEN registry for LMWH (4.6 days) and UFH (10.3 days).9 To our knowledge, the incidence of nosocomial infections (a secondary outcome) has not been studied in this patient population; therefore, there was no basis for estimating a sample size to detect a difference in this outcome, and the goal was instead to include as many patients as possible to best assess this variable. Because a high exclusion rate was expected, 504 patients were reviewed to target a sample size of 120 patients. Due to the single-center nature of this review, the secondary outcomes of thromboembolic complications and major and minor bleeding were expected to be underpowered.
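As an illustration of how a target of roughly 114 patients could follow from the REGIMEN means, the sketch below uses the standard two-sample normal-approximation formula; the pooled SD is not reported in the text, so a value of about 9.35 days is assumed here solely to reproduce the stated target.

```python
from math import ceil

from scipy.stats import norm


def n_per_group(mean_diff, pooled_sd, alpha=0.05, power=0.90):
    """Approximate per-group sample size for comparing two means (normal approximation)."""
    z_alpha = norm.ppf(1 - alpha / 2)
    z_beta = norm.ppf(power)
    return ceil(2 * (z_alpha + z_beta) ** 2 * pooled_sd ** 2 / mean_diff ** 2)


# REGIMEN mean LOS: 4.6 vs 10.3 days (difference 5.7). With an assumed pooled SD of
# about 9.35 days, the calculation yields roughly 57 per group, ie, about 114 total.
print(n_per_group(mean_diff=10.3 - 4.6, pooled_sd=9.35))
```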
The final analysis compared the enoxaparin arm with the UFH arm. Univariate differences between the treatment groups were compared using the Fisher exact test for categorical variables. Demographic data and other continuous variables were analyzed with an unpaired t test to compare means between the 2 arms. Outcomes and characteristics were deemed statistically significant when the P value was < .05. All reported P values were 2-tailed with a 95% CI. No statistical analysis was performed for the cost differences (based on LOS per treating specialty) between the 2 treatment arms. Statistical analyses were completed using GraphPad software (San Diego, CA).
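A minimal sketch of the two tests named above follows; the 2 × 2 infection table uses the counts reported in the Results, while the patient-level LOS vectors are hypothetical stand-ins, as individual data are not reported.

```python
from scipy.stats import fisher_exact, ttest_ind

# 2 x 2 table of nosocomial infection by arm, using the counts reported below:
# enoxaparin 2/14 infected, UFH 16/36 infected.
table = [[2, 12], [16, 20]]
odds_ratio, p_categorical = fisher_exact(table)

# Hypothetical patient-level LOS values (days) for an unpaired t test.
los_enoxaparin = [8, 9, 10, 11, 12, 10, 11]
los_ufh = [15, 16, 18, 20, 17, 19, 18]
t_stat, p_continuous = ttest_ind(los_enoxaparin, los_ufh)

print(round(p_categorical, 3), round(p_continuous, 5))
```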
Results
In total, 50 patients were analyzed in the study. Thirty-six patients were bridged with IV UFH at a concentration of 25,000 U/250 mL with an initial infusion rate of 12 U/kg/h. In the other arm, 14 patients were anticoagulated with renally dosed enoxaparin 1 mg/kg/d, with an average daily dose of 89.3 mg; the mean actual body weight in this group was 90.9 kg (consistent with the mean enoxaparin daily dose). The primary team physicians decided which parenteral anticoagulant to use. The difference in mean duration of inpatient parenteral anticoagulation between the groups was not statistically significant: 7.1 days with enoxaparin vs 9.6 days with UFH (P = .19). Patients in the enoxaparin arm were off warfarin therapy for an average of 6.0 days vs 7.5 days in the UFH group (P = .29). The duration of outpatient anticoagulation with enoxaparin was not analyzed in this study.
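For context, the weight-based UFH starting rate converts to a pump rate as shown below; the bag concentration and 12 U/kg/h rate are those stated above, while the 90-kg weight is hypothetical.

```python
def ufh_rate_ml_per_h(weight_kg, units_per_kg_per_h=12, bag_units=25_000, bag_volume_ml=250):
    """Convert a weight-based UFH rate to a pump rate (mL/h) for the stated bag."""
    concentration_u_per_ml = bag_units / bag_volume_ml  # 100 U/mL for 25,000 U/250 mL
    return units_per_kg_per_h * weight_kg / concentration_u_per_ml


# Hypothetical 90-kg patient: 12 U/kg/h x 90 kg = 1,080 U/h -> 10.8 mL/h
print(ufh_rate_ml_per_h(90))
```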
Patient and Procedure Characteristics
All patients had AF or atrial flutter, with 86% of patients (n = 43) having a CHADS2 score > 2 and 48% (n = 29) having a CHA2DS2VASc score > 4. Overall, the mean age was 71.3 years, with similar ethnicity distributions. Patients had multiple comorbidities, as shown by a mean Charlson Comorbidity Index (CCI) of 7.7, and an increased risk of bleeding, as evidenced by 98% (n = 48) of patients having a HAS-BLED score ≥ 3. A greater percentage of patients bridged with enoxaparin had DM, a history of stroke and MI, and a heart valve, whereas UFH patients were more likely to have stage 5 CKD (eGFR < 15 mL/min/1.73 m2), with a significantly lower mean eGFR (16.76 vs 22.64 mL/min/1.73 m2, P = .03). Furthermore, more patients were on hemodialysis in the UFH arm (50%) than in the enoxaparin arm (21%), and mean CrCl was lower with UFH (20.1 mL/min) than with enoxaparin (24.9 mL/min); however, the differences in hemodialysis and mean CrCl were not statistically significant. No patients in this review were on peritoneal dialysis.
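For readers less familiar with the thrombotic risk scores cited here, the sketch below shows how CHADS2 and CHA2DS2VASc points are assigned; the example patient is hypothetical, and HAS-BLED and CCI scoring are omitted for brevity.

```python
def chads2(chf, hypertension, age_years, diabetes, stroke_tia):
    """CHADS2: 1 point each for CHF, hypertension, age >= 75, and diabetes; 2 for prior stroke/TIA."""
    return (int(chf) + int(hypertension) + int(age_years >= 75)
            + int(diabetes) + 2 * int(stroke_tia))


def cha2ds2_vasc(chf, hypertension, age_years, diabetes, stroke_tia, vascular_disease, female):
    """CHA2DS2VASc adds vascular disease, age 65-74, and female sex; age >= 75 scores 2 points."""
    age_points = 2 if age_years >= 75 else (1 if age_years >= 65 else 0)
    return (int(chf) + int(hypertension) + age_points + int(diabetes)
            + 2 * int(stroke_tia) + int(vascular_disease) + int(female))


# Hypothetical 71-year-old man with hypertension, diabetes, and a prior stroke:
print(chads2(False, True, 71, True, True))                       # 4
print(cha2ds2_vasc(False, True, 71, True, True, False, False))   # 5
```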
Procedure Characteristics
The average Revised Cardiac Risk Index (RCRI) score was about 3, placing these patients at Class IV risk (approximately 11%) of a perioperative cardiac event (Table 3). Nineteen patients (38%) underwent major surgery, and all but 1 of the procedures (major or minor) were invasive. The average length of surgery was 1.2 hours, and patients were most likely to undergo cardiothoracic procedures (38%). Two of 14 patients (14%) on enoxaparin were able to have surgery as outpatients, whereas this did not occur in patients on UFH; the procedures completed for these 2 patients were a colostomy (minor surgery) and an arteriovenous graft repair (major surgery). There were no statistically significant differences in types of procedures between the 2 arms.
Outcomes
The primary outcome of this study, hospital LOS, differed significantly in the enoxaparin arm vs UFH: 10.2 days vs 17.5 days, P = .04 (Table 4). The time-to-discharge from initiation of parenteral anticoagulation was significantly reduced with enoxaparin (7.1 days) compared with UFH (11.9 days); P = .04. Although also reduced in the enoxaparin arm, ICU LOS did not show statistical significance (1.1 days vs 4.0 days, P = .09).
About 36% (n = 18) of patients in this study acquired an infection during hospitalization for elective surgery. The most common microorganism and site of infection were Enterococcus species and the urinary tract, respectively (Table 5). Nearly half (44%, n = 16) of the patients in the UFH group had a nosocomial infection vs 14% (n = 2) of enoxaparin-bridged patients, a difference that approached significance (P = .056). Both patients in the enoxaparin group had the urinary tract as the primary source of infection; 1 of these patients had a urologic procedure.
Major bleeding occurred in 7% (n = 1) of enoxaparin patients vs 22% (n = 8) in the UFH arm, a difference that was not statistically significant (P = .41). Minor bleeding was similar between the enoxaparin and UFH arms (14% vs 19%, P = .99). Regarding thromboembolic complications, the enoxaparin group had numerically fewer events than the UFH group (0% vs 11%), with VTE (n = 4) being the only component of the composite outcome to occur (P = .57). There were 4 deaths within 30 days posthospitalization, all in the UFH group (P = .57). Due to the small sample size, the study was not powered to detect statistically significant differences in these outcomes (bleeding and thrombotic events).
Economic Analysis
The average cost differences between enoxaparin and UFH hospitalizations (Table 6) were calculated by multiplying the average LOS per treating specialty by the national average Managerial Cost Accounting cost for an inpatient bed day in 2018.12 The treating specialty with the longest average LOS in the enoxaparin arm was thoracic (4.7 days). The UFH arm also had a long average LOS for the thoracic specialty (6.4 days), although the vascular specialty (6.7 days) had the longest average LOS in that group. Based on the mean LOS of 10.2 days in the enoxaparin arm, further stratified by treating specialty, the total cost per average inpatient stay was calculated as $51,710. In the UFH arm, the total cost per average inpatient stay was $92,848.
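The reported totals imply the following absolute and relative savings, a simple check of the figures above.

```python
enoxaparin_total = 51_710   # total cost per average inpatient stay, enoxaparin arm
ufh_total = 92_848          # total cost per average inpatient stay, UFH arm

savings = ufh_total - enoxaparin_total
print(f"${savings:,} ({savings / ufh_total:.0%})")  # $41,138 (44%)
```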
Monitoring
Anti-factor Xa levels for LMWH monitoring were not analyzed in this study because few values were collected; only 1 patient had an anti-factor Xa level checked during the study period. UFH infusion rates were adjusted based on aPTT levels collected per the MEDVAMC inpatient anticoagulation protocol. The average percentage of aPTT values in the therapeutic range was 46.3%, and the mean (SD) time-to-therapeutic range was about 2.4 (1.3) days. Because of the study’s retrospective nature, documentation of UFH infusion rates was inconsistent, and these values were not analyzed further.
Discussion
In 2017, the American College of Cardiology published the Periprocedural Anticoagulation Expert Consensus Pathway, which recommends that patients with AF at low risk of thromboembolism (CHA2DS2VASc 1-4) not be bridged unless they have had a prior VTE or stroke/TIA.13 Nearly half the patients in this study were classified as moderate-to-high thrombotic risk, as evidenced by a CHA2DS2VASc score > 4, with a mean score of 4.8. Because of this study’s retrospective design spanning 2008 to 2017, many clinicians may have referenced the 2008 CHEST antithrombotic guidelines when deciding whether to bridge patients; these guidelines and the previous MEDVAMC anticoagulation protocol recommend bridging patients with AF and a CHADS2 score > 2 (moderate-to-high thrombotic risk), a criterion all but 1 patient in this study met.1,14 In contrast to the landmark BRIDGE trial, the mean CHADS2 score in this study was 3.6, indicating a patient population at increased risk of stroke and embolism.
In addition to thromboembolic complications, patients in the current study also were at increased risk of clinically relevant bleeding, with a mean HAS-BLED score of 4.1 and nearly all patients having a score > 3. The complexity of the veteran population also was reflected in the study’s mean CCI (7.7) and RCRI (3.0), indicating an estimated 10-year survival of 0% and an approximately 11% risk of a perioperative cardiac event, respectively. A mean CCI of 7.7 is associated with a relative risk of 13.3 for death within 6 years after surgery.15 All patients had a diagnosis of hypertension, and in > 75% this diagnosis was complicated by DM. This patient population therefore had extensive cardiovascular disease or risk, making it clinically representative of patients who would require periprocedural bridging.
Another positive aspect of this study is that all baseline characteristics apart from renal function were similar between arms, strengthening the comparison of the 2 bridging modalities. We assume that more patients with stage 5 CKD and patients on dialysis were anticoagulated with UFH rather than enoxaparin because of concern for an increased risk of bleeding with a medication whose clearance is reduced by about 30% when CrCl is < 30 mL/min.16 Although enoxaparin 1 mg/kg/d is FDA approved as a therapeutic anticoagulant option, clinicians at MEDVAMC likely had reservations about its use in patients with end-stage CKD. Unlike many studies, including the BRIDGE trial, this review did not exclude patients with ACKD, so outcomes with enoxaparin in this population are available for interpretation.
Unsurprisingly, for patients included in this study, enoxaparin use led to a shorter hospital LOS, a reduced ICU LOS, and a quicker time-to-discharge from initiation of parenteral anticoagulation. This is attributable to the 100% bioavailability of SC enoxaparin and its suitability for outpatient use.16 Unlike patients receiving IV UFH, patients requiring bridging can be discharged on SC enoxaparin injections until a therapeutic INR is maintained with warfarin. Hospital LOS in both arms was longer in this study than in other studies.9 This may reflect clinicians’ greater caution with renally impaired patients and the multiple comorbidities of the included patients. According to an economic analysis performed by Amorosi and colleagues in 2004, bridging with enoxaparin instead of UFH can save up to $3,733 per patient and reduce bridging costs by 63% to 85%, driven primarily by decreased hospital LOS.10
Economic Outcome
In our study, we conducted a cost analysis using national VA data that indicated a $41,138 or 44% reduction in total cost per average inpatient stay when bridging 1 patient with enoxaparin vs UFH. The benefit of this cost analysis is that it reflects direct costs at VA institutions nationally; this will allow these data to be useful for practitioners at MEDVAMC and other VA hospitals. Stratifying the costs by treating specialty instead of treatment location minimized skewing of the data as there were some patients with long LOS in the ICU. No patients in the enoxaparin arm were treated in otolaryngology, which may have skewed the data. The data included direct costs for beds as well as costs for multiple services, such as procedures, pharmacy, nursing, laboratory tests, and imaging. Unlike the Amorosi study, our review did not include acquisition costs for enoxaparin syringes and bags of UFH or laboratory costs for aPTT and anti-factor Xa levels in part because of the data source and the difficulty calculating costs over a 10-year span.
Patients in the enoxaparin arm had a trend toward fewer hospital-acquired infections than those in the UFH arm, which we believe is due to decreased LOS (in both total hospital and ICU days) and fewer blood draws needed for monitoring. It also may be attributable to the longer mean duration of surgery in the UFH arm (1.3 hours) vs the enoxaparin arm (0.9 hours), although the percentage of patients with procedures ≥ 45 minutes and the types of procedures were similar between arms, and these differences were not statistically significant. In addition, hospitalized elderly males may require a urinary catheter (due to urinary retention), and catheter-associated urinary tract infection (CAUTI) is one of the most frequently reported infections in acute care hospitals in the US. This is consistent with our patient population and may be a supplementary reason for the higher incidence of infection with UFH; however, whether urinary catheters were used in these patients was not evaluated in this study.
Despite being at increased risk of a major adverse cardiovascular event (MACE), no patients in either arm had a stroke/TIA or MI within 30 days postprocedure. The only documented events were VTEs, which occurred in 4 patients, all on UFH. Four patients died during the study, all in the UFH arm. No meaningful conclusions can be drawn about the incidence of thromboembolic complications, death, or major and minor bleeding, as the study was underpowered for these outcomes. Although anti-factor Xa monitoring is recommended in patients with ACKD on enoxaparin, it was not routinely performed in this study. Another limitation was the inability to adequately assess the appropriateness of nurse-adjusted UFH infusion rates, largely due to the retrospective nature of the study. The variability in the percentage of aPTT values in the therapeutic range and in time-to-therapeutic range reflects the difficulty of monitoring UFH for safety and efficacy.
In 1991, Cruickshank and colleagues conducted a study in which a standard nomogram (similar to the MEDVAMC nomogram) for the adjustment of IV heparin was implemented at a single hospital.17 The success rate (percentage of aPTT values in the therapeutic range) was 59.4%, and the average time-to-therapeutic range was about 1 day. The success rate (46.3%) and time-to-therapeutic range (2.4 days) in our study were lower and longer, respectively, than expected. One potential reason for this discrepancy is the difference in indication: patients in the study by Cruickshank and colleagues were being treated for VTE, whereas patients in our study had AF or atrial flutter. In addition, documentation of heparin monitoring parameters was inconsistent because of the study time frame and retrospective design. Patients on UFH who do not reach the therapeutic range in a timely manner are at greater risk of MACE and major or minor bleeding; our study was not powered to detect these outcomes.
Strengths and Limitations
A significant limitation of this study was its small sample size: the study did not meet the calculated power for the primary outcome, and it is unknown whether it was adequately powered for nosocomial infections. The study also was not powered for other adverse events, such as thromboembolic complications, bleeding, and death. The arms were unevenly sized, which made it more difficult to compare the 2 patient populations appropriately, and medians were not reported for patient characteristics and outcomes.
Because of the study’s time frame, the clinical pharmacy services at MEDVAMC were not as robust as they are now, which is why decisions about which anticoagulant to use were primarily physician based. The use of TheraDoc to identify patients posed the risk of missing patients who may not have had the appropriate laboratory tests performed (ie, SCr). Patients on UFH had a lower eGFR than those on enoxaparin, which may limit extrapolation of enoxaparin use to end-stage renal disease. The lower eGFR and higher number of dialysis patients in the UFH arm may have contributed to more labile INRs and worse bleeding outcomes. Patients on hemodialysis typically have more comorbidities and an increased risk of infection because of the frequent use of catheters and needles to access the bloodstream. In addition, potential differences in catheter use and duration between groups were not identified; had these parameters been studied, the data may have helped explain the increased incidence of infection in the UFH arm.
Strengths of this study include a complex patient population with similar characteristics, distribution of ethnicities representative of the US population, patients at moderate-to-high thrombotic risk, the analysis of nosocomial infections, and the exclusion of patients with normal renal function or moderate CKD.
Conclusion
To our knowledge, this is the first study to compare periprocedural bridging outcomes and the incidence of nosocomial infections in patients with AF and ACKD. This review provides new evidence that, in this patient population, enoxaparin is a potential anticoagulant for reducing hospital LOS and hospital-acquired infections. Compared with UFH, bridging with enoxaparin reduced hospital LOS and anticoagulation time-to-discharge by about 7 and 5 days, respectively, and decreased the incidence of nosocomial infections by 30 percentage points (44% vs 14%). Using the mean LOS per treating specialty for both arms, bridging 1 patient with AF with enoxaparin rather than UFH can potentially lead to an estimated $40,000 (44%) reduction in the total cost of hospitalization. Enoxaparin also was not associated with numerically higher mortality or adverse events (stroke/TIA, MI, VTE) than UFH, but it is important to note that this study was not powered to find a significant difference in these outcomes. Because the mean eGFR of patients on enoxaparin was 22.6 mL/min/1.73 m2 and only about 1 in 5 had stage 5 CKD, at this time we do not recommend enoxaparin for periprocedural use in stage 5 CKD or in patients on hemodialysis. Larger studies, including randomized trials, are needed in this patient population to further evaluate these outcomes and assess the use of enoxaparin in patients with ACKD.
1. Douketis JD, Spyropoulos AC, Spencer FA, et al. Perioperative management of antithrombotic therapy: Antithrombotic Therapy and Prevention of Thrombosis, 9th ed: American College of Chest Physicians Evidence-Based Clinical Practice Guidelines. Chest. 2012;141(2)(suppl):e326S-350S.
2. Douketis JD, Spyropoulos AC, Kaatz S, et al; BRIDGE Investigators. Perioperative bridging anticoagulation in patients with atrial fibrillation. N Engl J Med. 2015;373(9):823-833.
3. Hammerstingl C, Schmitz A, Fimmers R, Omran H. Bridging of chronic oral anticoagulation with enoxaparin in patients with atrial fibrillation: results from the prospective BRAVE registry. Cardiovasc Ther. 2009;27(4):230-238.
4. Dad T, Weiner DE. Stroke and chronic kidney disease: epidemiology, pathogenesis, and management across kidney disease stages. Semin Nephrol. 2015;35(4):311-322.
5. Wattanakit K, Cushman M. Chronic kidney disease and venous thromboembolism: epidemiology and mechanisms. Curr Opin Pulm Med. 2009;15(5):408-412.
6. Saltiel M. Dosing low molecular weight heparins in kidney disease. J Pharm Pract. 2010;23(3):205-209.
7. Spinler SA, Inverso SM, Cohen M, Goodman SG, Stringer KA, Antman EM; ESSENCE and TIMI 11B Investigators. Safety and efficacy of unfractionated heparin versus enoxaparin in patients who are obese and patients with severe renal impairment: analysis from the ESSENCE and TIMI 11B studies. Am Heart J. 2003;146(1):33-41.
8. Fox KA, Antman EM, Montalescot G, et al. The impact of renal dysfunction on outcomes in the ExTRACT-TIMI 25 trial. J Am Coll Cardiol. 2007;49(23):2249-2255.
9. Spyropoulos AC, Turpie AG, Dunn AS, et al; REGIMEN Investigators. Clinical outcomes with unfractionated heparin or low-molecular-weight heparin as bridging therapy in patients on long-term oral anticoagulants: the REGIMEN registry. J Thromb Haemost. 2006;4(6):1246-1252.
10. Amorosi SL, Tsilimingras K, Thompson D, Fanikos J, Weinstein MC, Goldhaber SZ. Cost analysis of “bridging therapy” with low-molecular-weight heparin versus unfractionated heparin during temporary interruption of chronic anticoagulation. Am J Cardiol. 2004;93(4):509-511.
11. Inker LA, Astor BC, Fox CH, et al. KDOQI US commentary on the 2012 KDIGO clinical practice guideline for the evaluation and management of CKD. Am J Kidney Dis. 2014;63(5):713-735.
12. US Department of Veteran Affairs. Managerial Cost Accounting Financial User Support Reports: fiscal year 2018. https://www.herc.research.va.gov/include/page.asp?id=managerial-cost-accounting. [Source not verified.]
13. Doherty JU, Gluckman TJ, Hucker WJ, et al. 2017 ACC Expert Consensus Decision Pathway for Periprocedural Management of Anticoagulation in Patients With Nonvalvular Atrial Fibrillation: a report of the American College of Cardiology Clinical Expert Consensus Document Task Force. J Am Coll Cardiol. 2017;69(7):871-898.
14. Kearon C, Kahn SR, Agnelli G, et al. Antithrombotic therapy for venous thromboembolic disease: American College of Chest Physicians Evidence-Based Clinical Practice Guidelines (8th Edition). Chest. 2008;133(6 suppl):454S-545S.
15. Charlson M, Szatrowski TP, Peterson J, Gold J. Validation of a combined comorbidity index. J Clin Epidemiol. 1994;47(11):1245-1251.
16. Lovenox [package insert]. Bridgewater, NJ: Sanofi-Aventis; December 2017.
17. Cruickshank MK, Levine MN, Hirsh J, Roberts R, Siguenza M. A standard heparin nomogram for the management of heparin therapy. Arch Intern Med. 1991;151(2):333-337.
18. Steinberg BA, Peterson ED, Kim S, et al; Outcomes Registry for Better Informed Treatment of Atrial Fibrillation Investigators and Patients. Use and outcomes associated with bridging during anticoagulation interruptions in patients with atrial fibrillation: findings from the Outcomes Registry for Better Informed Treatment of Atrial Fibrillation (ORBIT-AF). Circulation. 2015;131(5):488-494.
19. Verheugt FW, Steinhubl SR, Hamon M, et al. Incidence, prognostic impact, and influence of antithrombotic therapy on access and nonaccess site bleeding in percutaneous coronary intervention. JACC Cardiovasc Interv. 2011;4(2):191-197.
20. Bijsterveld NR, Peters RJ, Murphy SA, Bernink PJ, Tijssen JG, Cohen M. Recurrent cardiac ischemic events early after discontinuation of short-term heparin treatment in acute coronary syndromes: results from the Thrombolysis in Myocardial Infarction (TIMI) 11B and Efficacy and Safety of Subcutaneous Enoxaparin in Non-Q-Wave Coronary Events (ESSENCE) studies. J Am Coll Cardiol. 2003;42(12):2083-2089.
Fluoroscopically Guided Lateral Approach Hip Injection
Hip injections are performed as diagnostic and therapeutic interventions across a variety of medical subspecialties, including but not limited to those practicing physical medicine and rehabilitation, pain medicine, sports medicine, orthopedic surgery, and radiology. Traditional image-guided intra-articular hip injection commonly uses an anterior-oblique approach from a starting point on the anterior groin traversing soft tissue anterior to the femoral neck to the target needle placement at the femoral head-neck junction.
In fluoroscopic procedures, a coaxial technique is used for safe and precise needle placement. The X-ray beam is angled in line with the projected path of the needle from skin entry point to injection target. With the coaxial, en face technique (also called EF, parallel, hub view, down the barrel, or barrel view), the needle appears as a single radiopaque dot over the target injection site.1 This technique minimizes needle redirection to correct the injection path and limits disturbance of surrounding tissue on the approach to the intended target.
The noncoaxial technique, as used in the anterior-oblique approach, intentionally directs the needle away from the skin entry point, with the needle barrel traversing the X-ray beam toward the injection target. The noncoaxial technique itself is one clinical challenge of the anterior-oblique (also referred to as anterior) approach; additional challenges include body habitus and pannus, proximity to neurovascular structures, and patient positioning. By understanding the risks and benefits of the varied technical approaches to a clinical goal, trainees are better able to select the technique most appropriate for a varied patient population.
Common risks to patients for all intra-articular interventions include bleeding, infection, and pain. Risk of damage to nearby structures, particularly the femoral vein, artery, and nerve that lie in close anatomical proximity to the target injection site, is often mentioned as part of a standard informed consent process. Prior studies examining complications of intra-articular hip injection commonly conclude that, despite a relatively low-risk profile for skilled interventionalists, needle placement in the medial 50% of the femoral head on antero-posterior imaging should be avoided.2
The anterior technique is a commonly described approach and can be used for both ultrasound-guided and fluoroscopically guided hip injections.3 With ultrasound guidance, the anterior technique can be performed with in-plane, direct visualization of the needle throughout the procedure. With fluoroscopic guidance, the anterior approach is performed out of plane, using the noncoaxial technique, which requires the interventionalist to rely on tactile and anatomic guidance to reach the target injection site. The anterior approach for hip injection is one of the few interventions in which the coaxial technique is not used, making instruction for a learner less concrete and potentially more challenging because the needle path is not under direct visualization in plane with the X-ray beam.
Technical guidance and detailed instruction for the lateral approach are infrequently described in fluoroscopic interventional texts. A lateral approach to hip injection was referenced as early as the 1970s, noting the advantage of visualizing the hip joint for needle placement when hardware is in place, but without detail on the technique.4 A more recent article described a lateral approach technique with the patient in a lateral decubitus position, which presents limitations for consistent fluoroscopic imaging and can be a challenging static position for the patient to maintain.5
The retrospective review of anterior-oblique and lateral approach procedures in this study aims to demonstrate that there is no significant difference in radiation exposure, rate of successful intra-articular injection, or complication rate between the approaches. If shown to be noninferior, the lateral approach may be a valuable interventional skill for those performing hip injections, giving the provider a second option for accessing the joint. The approach also can be added to the instructional plan of practitioners providing technical instruction to trainees within their health care system.
Methods
The institutional review board at the VA Ann Arbor Healthcare System reviewed and approved this study. Fluoroscopically guided hip injections were performed by 1 of 5 interventional pain physician staff members at the VA Ann Arbor Healthcare System. For the study cases, interventional pain fellows performed the procedures under the direct supervision of board-certified physicians. Supervising physicians included both physiatrists and anesthesiologists. Images were reviewed and evaluated without corresponding patient biographic data.
For cases using the lateral approach, patients were positioned supine on the fluoroscopy table. In anterior-posterior and lateral views, trajectory lines are drawn using a long metal marking rod held adjacent to the patient. With pulsed low-dose fluoroscopy, transverse lines are drawn to identify the midpoint of the femoral head in the lateral view (Figure 1A, x-axis) and the most direct line from skin to the lateral femoral head-neck junction target (Figure 1B, z-axis). The z-axis line marked on the skin is then confirmed in the lateral view to lie in a transverse plane that crosses the overlapping femoral heads (Figure 1A, y-axis).
The intersection of these transverse and coronal plane lines identifies the starting point for the most direct approach from skin to the injection target at the femoral head-neck junction. Using the coaxial technique in the lateral view, the needle is introduced and advanced to the lateral joint target with intermittent fluoroscopic images. Continuing in this view, the interventionalist can ensure that advancing the needle to its osseous endpoint will place the tip at the midpoint of the femoral head on the lateral surface, avoiding inadvertent advancement of the needle anterior or posterior to the femoral head. Final needle placement is then confirmed in the antero-posterior view (Figure 2A), and contrast enhancement is used to confirm intra-articular spread (Figure 2B).
Cases included in the study were performed over an 8-month period in 2017. Cases were identified by creating a list of all procedures performed and documented under the major joint injection procedure code with images recorded in IntelliSpace PACS Radiology software (Andover, MA); review began with the most recent cases. Two research team members (1 radiologist and 1 interventional pain physician) reviewed the series of saved images and the associated procedure report for each patient. De-identified study data were documented and recorded in Microsoft Excel (Redmond, WA).
Using the saved images and the associated procedure report, each case was classified by technical approach (anterior, lateral, or inconclusive); success of joint injection, as evidenced by appropriate contrast enhancement within the joint space (successful, unsuccessful, or incomplete images); documented use of sedation (yes, no); patient positioning (supine, prone); radiation exposure dose; and radiation exposure time. Additional comments, such as “notable pannus” or “hardware present,” were recorded to annotate significant findings on imaging review.
Statistical Analysis
The distributions of the 2 outcomes used to compare the approaches, radiation dose and exposure time, were checked using the Shapiro-Wilk test. A power analysis determined that inclusion of 30 anterior and 30 lateral cases would provide adequate power to detect a 1-point mean difference, assuming a standard deviation of 1.5 in each group. Both radiation dose and exposure time were nonnormally distributed (W = 0.65, P < .001 and W = 0.86, P < .001, respectively). The median and interquartile range (IQR) of dose and of exposure time in seconds were computed for the anterior and lateral approaches. Differences in median radiation dose and exposure time between approaches were assessed with the k-sample test of equality of medians, and complication rates were compared between groups. All analyses were conducted using Stata Version 14.1 (College Station, TX).
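Readers wishing to reproduce this style of analysis outside Stata could follow the same workflow; the sketch below is a minimal example using Python's scipy with hypothetical dose values (not study data) that runs the Shapiro-Wilk normality check, summarizes each group by median and IQR, and applies the k-sample (Mood's) test of equality of medians.

# Minimal sketch of the statistical workflow described above, implemented in
# Python with scipy instead of Stata. All dose values are hypothetical
# placeholders, not study data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical fluoroscopy doses (mGy) for 30 anterior and 30 lateral cases
anterior = rng.lognormal(mean=0.0, sigma=0.6, size=30)
lateral = rng.lognormal(mean=1.0, sigma=0.6, size=30)

# 1. Shapiro-Wilk test for normality of each outcome
for name, sample in (("anterior", anterior), ("lateral", lateral)):
    w, p = stats.shapiro(sample)
    print(f"{name}: W = {w:.2f}, P = {p:.3f}")

# 2. Median and interquartile range for nonnormally distributed data
for name, sample in (("anterior", anterior), ("lateral", lateral)):
    q1, med, q3 = np.percentile(sample, [25, 50, 75])
    print(f"{name}: median {med:.1f} mGy (IQR {q1:.1f}-{q3:.1f})")

# 3. k-sample (Mood's) test of equality of medians between approaches
stat, p, grand_median, table = stats.median_test(anterior, lateral)
print(f"Median test of equality of medians: P = {p:.2f}")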
Results
Between June 2017 and January 2018, 88 performed cases were reviewed, and 30 anterior and 30 lateral approach cases were included in this retrospective comparison study. A total of 28 cases were excluded for an inconclusive approach, multiple or bilateral procedures, missing dose and time data, or inadequately saved images (Figure 3).
The rate of successful intervention, with needle placement confirmed within the articular space on contrast enhancement, did not differ significantly between the study groups: 96.7% (29 of 30) of anterior approach cases and 100% (30 of 30) of lateral approach cases were reported as successful. Overhanging pannus in the viewing area was reported in 5 anterior approach cases and 4 lateral approach cases. Hardware was noted in 2 lateral approach cases and in no anterior approach cases. Sedation was used in 3 anterior approach cases and in no lateral approach cases.
Patients undergoing the lateral approach received a higher median radiation dose than did those undergoing the anterior approach, but this was not statistically significant (P = .07) (Table). Those undergoing the lateral approach also had a longer median exposure time than did those undergoing the anterior approach, but this also was not statistically significant (P = .3). With no immediate complications reported in any of the studied interventions, there was no difference in complication rates between anterior and lateral approach cases.
Discussion
Pain medicine fellows, who have completed residency in a variety of disciplines, most often anesthesiology or physical medicine and rehabilitation, perform fluoroscopically guided procedures and benefit from increased experience with the coaxial technique, which improves awareness of needle depth and location. Once mastered, this skill set can be applied to multiple interventional pain procedures. Similar technical instruction with an emphasis on the coaxial technique, as performed for hip injections using an anterior or anterolateral approach, can be applied to both fluoroscopic and ultrasound-guided procedures, including facet injection, transforaminal epidural steroid injection, and myriad other procedures performed to ameliorate pain; there are advantages to pursuing a consistent approach across image-guided procedures. This comparison study evaluates an alternative technique with potential for risk reduction through reduced proximity to neurovascular structures, which may yield a safer procedure profile.
Using a lateral approach, the interventionalist selects a starting point that enters the skin at a greater distance from any overlying pannus and from the elevated concentration of gram-negative and gram-positive bacteria within the inguinal skin.6 A previous study demonstrated improved success of intra-articular needle tip placement without image guidance in patients with body mass index (BMI) < 30.7 A prior study comparing an anterior approach using anatomic landmarks with a lateral approach demonstrated that the anterior approach pierced or contacted the femoral nerve in 27% of anterior cases and came within 5 mm of the nerve in 60% of anterior cases.2 Image guidance, whether ultrasound, fluoroscopy, or computed tomography (CT), is preferred because it reduces the risk of contact with adjacent neurovascular structures. Anatomic surface landmarks have been described as an alternative injection technique that does not use fluoroscopy to confirm initial, intraprocedure, and final placement.8 Palpation of anatomic structures is required for this nonimage-guided technique; its starting point is more lateral than that of the anterior approach, but it is not in the most lateral position in the transverse plane used in this fluoroscopically guided lateral approach study.
Physiologic characteristics of subjects and technical aspects of fluoroscopy can both affect radiation dose and exposure time for hip injections. Patient BMI was not collected, and further study would seek to determine whether BMI significantly increases radiation dose and exposure time with lateral approach injections. Lateral fluoroscopic images require the X-ray beam to penetrate more tissue than do anterior-posterior images. Further study of these techniques would benefit from comparing the pulse rate of fluoroscopic imaging and collimation (focusing of the radiation beam over a smaller area of tissue) as factors in any observed increase in total radiation dose and exposure time.
Improving the safety profile of this procedure could have a positive impact on the patient population receiving fluoroscopic hip injections, both within the VA Ann Arbor Healthcare System and elsewhere. Although the study population was limited to VA patients seeking subspecialty nonsurgical joint care at a single tertiary care center, the technique is generalizable and can be used in most patients, as hip pain is a common condition necessitating nonoperative evaluation and treatment.
Radiation Exposures
As our analysis demonstrates, radiation dose exposure for each group was consistent with low (≤ 3 mSv) to moderate (> 3-20 mSv) annual effective doses in the general population.7 The median radiation doses for the anterior and lateral approaches (1 mGy and 3 mGy, respectively) are on the order of the standard exposure for radiographs of the pelvis (1.31 mGy).9 It is therefore reasonable to consider a lateral approach for hip injection, given the benefits of a direct coaxial approach and avoidance of needle entry through skin with a higher concentration of bacteria.
The lateral approach did have a higher radiation dose and longer exposure time, although neither was statistically significantly greater than with the anterior approach, and the differences in dose and procedure time between techniques were not clinically significant. One potential explanation is that the lateral technique requires the beam to penetrate more tissue, which can be mitigated with collimation and other fluoroscopic image adjustments. Additionally, as trainees progress in competency, fewer images should need to be obtained.7 We hypothesize that as familiarity and comfort with this technique increase, the number of images necessary for successful injection will decrease, reducing radiation dose and exposure time. We would expect radiation dose and exposure time in the hands of a board-certified interventionalist to be significantly lower than in our current dataset, and this is an area of planned further study. In the existing dataset, the majority of procedures were performed with trainees, and inadequate information was documented to compare dose against procedural experience over time for individual physicians.
A notable strength of this study is the direct comparison of the anterior and lateral approaches with regard to radiation dose and exposure time, which we have not seen described in the literature. A detailed description of the technique may increase its use by other providers. Data were collected from multiple providers, as board-certified pain physicians and board-eligible interventional pain fellows performed the procedures. This variability increases the generalizability of the findings, with a variety of providers, disciplines, years of experience, and types of training represented.
Limitations
Limitations include the retrospective nature of the study and the relatively small sample size. Even so, no statistically significant differences were observed in radiation dose or fluoroscopy exposure time, suggesting that the lateral approach is, at minimum, a noninferior technique. Combined with its improved safety profile, this technique is a viable alternative to the traditional anterior-oblique approach. Further study should be performed, such as a prospective, randomized controlled trial comparing the 2 techniques and following pain scores and functional ability after the procedure.
Conclusion
Given the decreased procedural risk related to proximity of neurovascular structures and the coaxial technique for needle advancement, the lateral approach for hip injection should be considered by those in any discipline performing fluoroscopically guided procedures. The lateral technique may be particularly useful as a noninferior alternative to the traditional anterior method in technically challenging cases and when skin entry at the anterior groin is suboptimal.
1. Cianfoni A, Boulter DJ, Rumboldt Z, Sapton T, Bonaldi G. Guidelines to imaging landmarks for interventional spine procedures: fluoroscopy and CT anatomy. Neurographics. 2011;1(1):39-48.
2. Leopold SS, Battista V, Oliverio JA. Safety and efficacy of intraarticular hip injection using anatomic landmarks. Clin Orthop Relat Res. 2001;(391):192-197.
3. Dodré E, Lefebvre G, Cockenpot E, Chastanet P, Cotten A. Interventional MSK procedures: the hip. Br J Radiol. 2016;89(1057):20150408.
4. Hankey S, McCall IW, Park WM, O’Connor BT. Technical problems in arthrography of the painful hip arthroplasty. Clin Radiol. 1979;30(6):653-656.
5. Yasar E, Singh JR, Hill J, Akuthota V. Image-guided injections of the hip. J Nov Physiother Phys Rehabil. 2014;1(2):39-48.
6. Aly R, Maibach HI. Aerobic microbial flora of intertrigenous skin. Appl Environ Microbiol. 1977;33(1):97-100.
7. Fazel R, Krumholz HM, Wang W, et al. Exposure to low-dose ionizing radiation from medical imaging procedures. N Engl J Med. 2009;361(9):849-857.
8. Masoud MA, Said HG. Intra-articular hip injection using anatomic surface landmarks. Arthrosc Tech. 2013;2(2):e147-e149.
9. Ofori K, Gordon SW, Akrobortu E, Ampene AA, Darko EO. Estimation of adult patient doses for selected x-ray diagnostic examinations. J Radiat Res Appl Sci. 2014;7(4):459-462.
Hip injections are performed as diagnostic and therapeutic interventions across a variety of medical subspecialties, including but not limited to those practicing physical medicine and rehabilitation, pain medicine, sports medicine, orthopedic surgery, and radiology. Traditional image-guided intra-articular hip injection commonly uses an anterior-oblique approach from a starting point on the anterior groin traversing soft tissue anterior to the femoral neck to the target needle placement at the femoral head-neck junction.
In fluoroscopic procedures, a coaxial technique for needles placement is used for safe and precise insertion of needles. An X-ray beam is angled in line with the projected path of the needle from skin entry point to injection target. Coaxial, en face technique (also called EF, parallel, hub view, down the barrel, or barrel view) appears as a single radiopaque dot over the target injection site.1 This technique minimizes needle redirection for correction of the injection path and minimal disturbance of surrounding tissue on the approach to the intended target.
Noncoaxial technique, as used in the anterior-oblique approach, intentionally directs the needle away from a skin entry point, the needle barrel traversing the X-ray beam toward an injection target. Clinical challenges to injection with the anterior-oblique approach include using a noncoaxial technique. Additional challenges to the anterior-oblique (also referred to as anterior) approach are body habitus and pannus, proximity to neurovascular structures, and patient positioning. By understanding the risks and benefits of varied technical approaches to accomplish a clinical goal and outcome, trainees are better able to select the technique most appropriate for a varied patient population.
Common risks to patients for all intra-articular interventions include bleeding, infection, and pain. Risk of damage to nearby structures is often mentioned as part of a standard informed consent process as it relates to the femoral vein, artery, and nerve that are in close anatomical proximity to the target injection site. When prior studies have examined the risk of complications resulting from intra-articular hip injections, a common conclusion is that despite a relatively low-risk profile for skilled interventionalists, efforts to avoid needle placement in the medial 50% of the femoral head on antero-posterior imaging is recommended.2
The anterior technique is a commonly described approach, and the same can be used for both ultrasound-guided and fluoroscopically guided hip injections.3 Using ultrasound guidance, the anterior technique can be performed with in-plane direct visualization of the needle throughout the procedure. With fluoroscopic guidance, the anterior approach is performed out-of-plane, using the noncoaxial technique. This requires the interventionalist to use tactile and anatomic guidance to the target injection site. The anterior approach for hip injection is one of few interventions where coaxial technique is not used for the procedure, making the instruction for a learner less concrete and potentially more challenging related to the needle path not under direct visualization in plane with the X-ray beam.
Technical guidance and detailed instruction for the lateral approach is infrequently described in fluoroscopic interventional texts. Reference to a lateral approach hip injection was made as early as the 1970s, without detail provided on the technique, with respect to the advantage of visualization of the hip joint for needle placement when hardware is in place.4 A more recent article described a lateral approach technique involving the patient in a decubitus (lateral) supine position, which presents limitations in consistent fluoroscopic imaging and can be a challenging static position for the patient to maintain.5
The retrospective review of anterior-oblique and lateral approach procedures in this study aims to demonstrate that there is no significant difference in radiation exposure, rate of successful intra-articular injection, or complication rate. If proven as a noninferior technique, the lateral approach may be a valuable interventional skill to those performing hip injections. Potential benefits to the patient and provider include options for the provider to access the joint using either technique. Additionally, the approach can be added to the instructional plan for those practitioners providing technical instruction to trainees within their health care system.
Methods
The institutional review board at the VA Ann Arbor Healthcare System reviewed and granted approval for this study. One of 5 interventional pain physician staff members at the VA Ann Arbor Healthcare System performed fluoroscopically guided hip injections. Interventional pain fellows under the direct supervision of board-certified physicians performed the procedures for the study cases. Supervising physicians included both physiatrists and anesthesiologists. Images were reviewed and evaluated without corresponding patient biographic data.
For cases using the lateral approach, the patients were positioned supine on the fluoroscopy table. In anterior-posterior and lateral views, trajectory lines are drawn using a long metal marking rod held adjacent to the patient. With pulsed low-dose fluoroscopy, transverse lines are drawn to identify midpoint of the femoral head in lateral view (Figure 1A, x-axis) and the most direct line from skin to lateral femoral head neck junction joint target (Figure 1B, z-axis). Also confirmed in lateral view, the z-axis marked line drawn on the skin is used to confirm that this transverse plane crosses the overlapping femoral heads (Figure 1A, y-axis).
The cross-section of these transverse and coronal plane lines identifies the starting point for the most direct approach from skin to injection target at femoral head-neck junction. Using the coaxial technique in the lateral view, the needle is introduced and advanced using intermittent fluoroscopic images to the lateral joint target. Continuing in this view, the interventionalist can ensure that advancing the needle to the osseous endpoint will place the tip at the midpoint of the femoral head at the target on the lateral surface, avoiding inadvertent advance of the needle anterior or posterior the femoral head. Final needle placement confirmation is then completed in antero-posterior view (Figure 2A). Contrast enhancement is used to confirm intra-articular spread (Figure 2B).
Cases included in the study were performed over an 8-month period in 2017. Case images recorded in IntelliSpace PACS Radiology software (Andover, MA) were included by creating a list of all cases performed and documented using the major joint injection procedure code. The cases reviewed began with the most recent cases. Two research team members (1 radiologist and 1 interventional pain physician) reviewed the series of saved images for each patient and the associated procedure report. The research team members documented and recorded de-identified study data in Microsoft Excel (Redmond, WA).
Imaging reports, using the saved images and the associated procedure report, were classified for technical approach (anterior, lateral, or inconclusive), success of joint injection as evidenced by appropriate contrast enhancement within the joint space (successful, unsuccessful, or incomplete images), documented use of sedation (yes, no), patient positioning (supine, prone), radiation exposure dose, radiation exposure time, and additional comments, such as “notable pannus” or “hardware present” to annotate significant findings on imaging review.
Statistical Analysis
The distribution of 2 outcomes used to compare rates of complication, radiation dose, and exposure time was checked using the Shapiro-Wilk test. Power analysis determined that inclusion of 30 anterior and 30 lateral cases results in adequate power to detect a 1-point mean difference, assuming a standard deviation of 1.5 in each group. Both radiation dose and exposure time were found to be nonnormally distributed (W = 0.65, P < .001; W = 0.86, P < .001; respectively). Median and interquartile range (IQR) of dose and time in seconds for anterior and lateral approaches were computed. Median differences in radiation dose and exposure time between anterior and lateral approaches were assessed with the k-sample test of equality of medians. All analyses were conducted using Stata Version 14.1 (College Station, TX).
Results
Between June 2017 and January 2018, 88 cases were reviewed as performed, with 30 anterior and 30 lateral approach cases included in this retrospective comparison study. A total of 28 cases were excluded from the study for using an inconclusive approach, multiple or bilateral procedures, cases without recorded dose and time data, and inadequately saved images to provide meaningful data (Figure 3).
Rate of successful intervention with needle placement confirmed within the articular space on contrast enhancement was not significantly different in the study groups with 96.7% (29 of 30) anterior approach cases reported as successful, 100% (30 of 30) lateral approach cases reported as successful. Overhanging pannus in the viewing area was reported in 5 anterior approach cases and 4 lateral cases. Hardware was noted in 2 lateral approach cases, none in anterior approach cases. Sedation was used for 3 of the anterior approach cases and none of the lateral approach cases.
Patients undergoing the lateral approach received a higher median radiation dose than did those undergoing the anterior approach, but this was not statistically significant (P = .07) (Table). Those undergoing the lateral approach also had a longer median exposure time than did those undergoing the anterior approach, but this also was not statistically significant (P = .3). With no immediate complications reported in any of the studied interventions, there was no difference in complication rates between anterior and lateral approach cases.
Discussion
Pain medicine fellows who have previously completed residency in a variety of disciplines, often either anesthesiology or physical medicine and rehabilitation, perform fluoroscopically guided procedures and benefit from increased experience with coaxial technique as this improves needle depth and location awareness. Once mastered, this skill set can be applied to and useful for multiple interventional pain procedures. Similar technical instruction with an emphasis on coaxial technique for hip injections as performed in the anterior or anterolateral approach can be used in both fluoroscopic and ultrasound-guided procedures, including facet injection, transforaminal epidural steroid injection, and myriad other procedures performed to ameliorate pain. There are advantages to pursuing a similar approach with all image-guided procedures. Evaluated in this comparison study is an alternative technique that has potential for risk reduction benefit with reduced proximity to neurovascular structures, which ultimately leads to a safer procedure profile.
Using a lateral approach, the interventionalist determines a starting point, entering the skin at a greater distance from any overlying pannus and the elevated concentration of gram-negative and gram-positive bacteria contained within the inguinal skin.6 A previous study demonstrated improved success of intra-articular needle tip placement without image guidance in patients with body mass index (BMI) < 30.7 A prior study of anterior approach using anatomic landmarks as compared to lateral approach demonstrated the anterior approach pierced or contacted the femoral nerve in 27% of anterior cases and came within 5 mm of 60% of anterior cases.2 Use of image guidance, whether ultrasound, fluoroscopy, or computed tomography (CT) is preferred related to reduced risk of contact with adjacent neurovascular structures. Anatomic surface landmarks have been described as an alternative injection technique, without the use of fluoroscopy for confirmatory initial, intraprocedure, and final placement.8 Palpation of anatomic structures is required for this nonimage-guided technique, and although similar to the described technique in this study, the anatomically guided injection starting point is more lateral than the anterior approach but not in the most lateral position in the transverse plane that is used for this fluoroscopically guided lateral approach study.
Physiologic characteristics of subjects and technical aspects of fluoroscopy both can be factors in radiation dose and exposure times for hip injections. Patient BMI was not included in the data collection, but further study would seek to determine whether BMI is a significant risk for any increased radiation dose and exposure times using lateral approach injections. Use of lateral images for fluoroscopy requires penetration of X-ray beam through more tissue compared with that of anterior-posterior images. Further study of these techniques would benefit from comparing the pulse rate of fluoroscopic images and collimation (or focusing of the radiation beam over a smaller area of tissue) as factors in any observed increase in total radiation dose and exposure times.
Improving the safety profile of this procedure could have a positive impact on the patient population receiving fluoroscopic hip injections, both within the VA Ann Arbor Health System and elsewhere. While the study population was limited to the VA patient population seeking subspecialty nonsurgical joint care at a single tertiary care center, this technique is generalizable and can be used in most patients, as hip pain is a common condition necessitating nonoperative evaluation and treatment.
Radiation Exposures
As our analysis demonstrates, mean radiation dose exposure for each group was consistent with low (≤ 3 mSv) to moderate (> 3-20 mSv) annual effective doses in the general population.7 Both anterior and lateral median radiation dose of 1 mGy and 3 mGy, respectively, are within the standard exposure for radiographs of the pelvis (1.31 mGy).9 It is therefore reasonable to consider a lateral approach for hip injection, given the benefits of direct coaxial approach and avoiding needle entry through higher bacteria-concentrated skin.
The lateral approach did have increased radiation dose and exposure time, although it was not statistically significantly greater than the anterior approach. The difference between radiation dose and time to perform either technique was not clinically significant. One potential explanation for this is that the lateral technique has increased tissue to penetrate, which can be reduced with collimation and other fluoroscopic image adjustments. Additionally, as trainees progress in competency, fewer images should need to be obtained.7 We hypothesize that as familiarity and comfort with this technique increase, the number of images necessary for successful injection would decrease, leading to decreased radiation dose and exposure time. We would expect that in the hands of a board-certified interventionalist, radiation dose and exposure time would be significantly decreased as compared to our current dataset, and this is an area of planned further study. With our existing dataset, the majority of procedures were performed with trainees, with inadequate information documented for comparison of dose over time and procedural experience under individual physicians.
Notable strengths of this study are the direct comparison of the anterior approach when compared to the lateral approach with regard to radiation dose and exposure time, which we have not seen described in the literature. A detailed description of the technique may result in increased utilization by other providers. Data were collected from multiple providers, as board-certified pain physicians and board-eligible interventional pain fellows performed the procedures. This variability in providers increases the generalizability of the findings, with a variety of providers, disciplines, years of experiences, and type of training represented.
Limitations
Limitations include the retrospective nature of the study and the relatively small sample size. However, even with this limitation, it is notable that no statistically significant differences were observed in mean radiation dose or fluoroscopy exposure time, making the lateral approach, at minimum, a noninferior technique. Combined with the improved safety profile, this technique is a viable alternative to the traditional anterior-oblique approach. Further study should be performed, such as a prospective, randomized control trial investigating the 2 techniques and following pain scores and functional ability after the procedure.
Conclusion
Given the decreased procedural risk related to proximity of neurovascular structures and coaxial technique for needle advancement, lateral approach for hip injection should be considered by those in any discipline performing fluoroscopically guided procedures. Lateral technique may be particularly useful in technically challenging cases and when skin entry at the anterior groin is suboptimal, as a noninferior alternative to traditional anterior method.
Hip injections are performed as diagnostic and therapeutic interventions across a variety of medical subspecialties, including but not limited to those practicing physical medicine and rehabilitation, pain medicine, sports medicine, orthopedic surgery, and radiology. Traditional image-guided intra-articular hip injection commonly uses an anterior-oblique approach from a starting point on the anterior groin traversing soft tissue anterior to the femoral neck to the target needle placement at the femoral head-neck junction.
In fluoroscopic procedures, a coaxial technique for needles placement is used for safe and precise insertion of needles. An X-ray beam is angled in line with the projected path of the needle from skin entry point to injection target. Coaxial, en face technique (also called EF, parallel, hub view, down the barrel, or barrel view) appears as a single radiopaque dot over the target injection site.1 This technique minimizes needle redirection for correction of the injection path and minimal disturbance of surrounding tissue on the approach to the intended target.
Noncoaxial technique, as used in the anterior-oblique approach, intentionally directs the needle away from a skin entry point, the needle barrel traversing the X-ray beam toward an injection target. Clinical challenges to injection with the anterior-oblique approach include using a noncoaxial technique. Additional challenges to the anterior-oblique (also referred to as anterior) approach are body habitus and pannus, proximity to neurovascular structures, and patient positioning. By understanding the risks and benefits of varied technical approaches to accomplish a clinical goal and outcome, trainees are better able to select the technique most appropriate for a varied patient population.
Common risks to patients for all intra-articular interventions include bleeding, infection, and pain. Risk of damage to nearby structures is often mentioned as part of a standard informed consent process as it relates to the femoral vein, artery, and nerve that are in close anatomical proximity to the target injection site. When prior studies have examined the risk of complications resulting from intra-articular hip injections, a common conclusion is that despite a relatively low-risk profile for skilled interventionalists, efforts to avoid needle placement in the medial 50% of the femoral head on antero-posterior imaging is recommended.2
The anterior technique is a commonly described approach, and the same can be used for both ultrasound-guided and fluoroscopically guided hip injections.3 Using ultrasound guidance, the anterior technique can be performed with in-plane direct visualization of the needle throughout the procedure. With fluoroscopic guidance, the anterior approach is performed out-of-plane, using the noncoaxial technique. This requires the interventionalist to use tactile and anatomic guidance to the target injection site. The anterior approach for hip injection is one of few interventions where coaxial technique is not used for the procedure, making the instruction for a learner less concrete and potentially more challenging related to the needle path not under direct visualization in plane with the X-ray beam.
Technical guidance and detailed instruction for the lateral approach is infrequently described in fluoroscopic interventional texts. Reference to a lateral approach hip injection was made as early as the 1970s, without detail provided on the technique, with respect to the advantage of visualization of the hip joint for needle placement when hardware is in place.4 A more recent article described a lateral approach technique involving the patient in a decubitus (lateral) supine position, which presents limitations in consistent fluoroscopic imaging and can be a challenging static position for the patient to maintain.5
The retrospective review of anterior-oblique and lateral approach procedures in this study aims to demonstrate that there is no significant difference in radiation exposure, rate of successful intra-articular injection, or complication rate. If proven as a noninferior technique, the lateral approach may be a valuable interventional skill to those performing hip injections. Potential benefits to the patient and provider include options for the provider to access the joint using either technique. Additionally, the approach can be added to the instructional plan for those practitioners providing technical instruction to trainees within their health care system.
Methods
The institutional review board at the VA Ann Arbor Healthcare System reviewed and granted approval for this study. One of 5 interventional pain physician staff members at the VA Ann Arbor Healthcare System performed fluoroscopically guided hip injections. Interventional pain fellows under the direct supervision of board-certified physicians performed the procedures for the study cases. Supervising physicians included both physiatrists and anesthesiologists. Images were reviewed and evaluated without corresponding patient biographic data.
For cases using the lateral approach, the patients were positioned supine on the fluoroscopy table. In anterior-posterior and lateral views, trajectory lines are drawn using a long metal marking rod held adjacent to the patient. With pulsed low-dose fluoroscopy, transverse lines are drawn to identify midpoint of the femoral head in lateral view (Figure 1A, x-axis) and the most direct line from skin to lateral femoral head neck junction joint target (Figure 1B, z-axis). Also confirmed in lateral view, the z-axis marked line drawn on the skin is used to confirm that this transverse plane crosses the overlapping femoral heads (Figure 1A, y-axis).
The cross-section of these transverse and coronal plane lines identifies the starting point for the most direct approach from skin to injection target at femoral head-neck junction. Using the coaxial technique in the lateral view, the needle is introduced and advanced using intermittent fluoroscopic images to the lateral joint target. Continuing in this view, the interventionalist can ensure that advancing the needle to the osseous endpoint will place the tip at the midpoint of the femoral head at the target on the lateral surface, avoiding inadvertent advance of the needle anterior or posterior the femoral head. Final needle placement confirmation is then completed in antero-posterior view (Figure 2A). Contrast enhancement is used to confirm intra-articular spread (Figure 2B).
Cases included in the study were performed over an 8-month period in 2017. Case images recorded in IntelliSpace PACS Radiology software (Andover, MA) were included by creating a list of all cases performed and documented using the major joint injection procedure code. The cases reviewed began with the most recent cases. Two research team members (1 radiologist and 1 interventional pain physician) reviewed the series of saved images for each patient and the associated procedure report. The research team members documented and recorded de-identified study data in Microsoft Excel (Redmond, WA).
Imaging reports, using the saved images and the associated procedure report, were classified for technical approach (anterior, lateral, or inconclusive), success of joint injection as evidenced by appropriate contrast enhancement within the joint space (successful, unsuccessful, or incomplete images), documented use of sedation (yes, no), patient positioning (supine, prone), radiation exposure dose, radiation exposure time, and additional comments, such as “notable pannus” or “hardware present” to annotate significant findings on imaging review.
Statistical Analysis
The distribution of 2 outcomes used to compare rates of complication, radiation dose, and exposure time was checked using the Shapiro-Wilk test. Power analysis determined that inclusion of 30 anterior and 30 lateral cases results in adequate power to detect a 1-point mean difference, assuming a standard deviation of 1.5 in each group. Both radiation dose and exposure time were found to be nonnormally distributed (W = 0.65, P < .001; W = 0.86, P < .001; respectively). Median and interquartile range (IQR) of dose and time in seconds for anterior and lateral approaches were computed. Median differences in radiation dose and exposure time between anterior and lateral approaches were assessed with the k-sample test of equality of medians. All analyses were conducted using Stata Version 14.1 (College Station, TX).
Results
Between June 2017 and January 2018, 88 cases were reviewed as performed, with 30 anterior and 30 lateral approach cases included in this retrospective comparison study. A total of 28 cases were excluded from the study for using an inconclusive approach, multiple or bilateral procedures, cases without recorded dose and time data, and inadequately saved images to provide meaningful data (Figure 3).
Rate of successful intervention with needle placement confirmed within the articular space on contrast enhancement was not significantly different in the study groups with 96.7% (29 of 30) anterior approach cases reported as successful, 100% (30 of 30) lateral approach cases reported as successful. Overhanging pannus in the viewing area was reported in 5 anterior approach cases and 4 lateral cases. Hardware was noted in 2 lateral approach cases, none in anterior approach cases. Sedation was used for 3 of the anterior approach cases and none of the lateral approach cases.
Patients undergoing the lateral approach received a higher median radiation dose than did those undergoing the anterior approach, but this was not statistically significant (P = .07) (Table). Those undergoing the lateral approach also had a longer median exposure time than did those undergoing the anterior approach, but this also was not statistically significant (P = .3). With no immediate complications reported in any of the studied interventions, there was no difference in complication rates between anterior and lateral approach cases.
Discussion
Pain medicine fellows who have previously completed residency in a variety of disciplines, often either anesthesiology or physical medicine and rehabilitation, perform fluoroscopically guided procedures and benefit from increased experience with coaxial technique as this improves needle depth and location awareness. Once mastered, this skill set can be applied to and useful for multiple interventional pain procedures. Similar technical instruction with an emphasis on coaxial technique for hip injections as performed in the anterior or anterolateral approach can be used in both fluoroscopic and ultrasound-guided procedures, including facet injection, transforaminal epidural steroid injection, and myriad other procedures performed to ameliorate pain. There are advantages to pursuing a similar approach with all image-guided procedures. Evaluated in this comparison study is an alternative technique that has potential for risk reduction benefit with reduced proximity to neurovascular structures, which ultimately leads to a safer procedure profile.
Using a lateral approach, the interventionalist determines a starting point, entering the skin at a greater distance from any overlying pannus and the elevated concentration of gram-negative and gram-positive bacteria contained within the inguinal skin.6 A previous study demonstrated improved success of intra-articular needle tip placement without image guidance in patients with body mass index (BMI) < 30.7 A prior study of anterior approach using anatomic landmarks as compared to lateral approach demonstrated the anterior approach pierced or contacted the femoral nerve in 27% of anterior cases and came within 5 mm of 60% of anterior cases.2 Use of image guidance, whether ultrasound, fluoroscopy, or computed tomography (CT) is preferred related to reduced risk of contact with adjacent neurovascular structures. Anatomic surface landmarks have been described as an alternative injection technique, without the use of fluoroscopy for confirmatory initial, intraprocedure, and final placement.8 Palpation of anatomic structures is required for this nonimage-guided technique, and although similar to the described technique in this study, the anatomically guided injection starting point is more lateral than the anterior approach but not in the most lateral position in the transverse plane that is used for this fluoroscopically guided lateral approach study.
Physiologic characteristics of subjects and technical aspects of fluoroscopy both can be factors in radiation dose and exposure times for hip injections. Patient BMI was not included in the data collection, but further study would seek to determine whether BMI is a significant risk for any increased radiation dose and exposure times using lateral approach injections. Use of lateral images for fluoroscopy requires penetration of X-ray beam through more tissue compared with that of anterior-posterior images. Further study of these techniques would benefit from comparing the pulse rate of fluoroscopic images and collimation (or focusing of the radiation beam over a smaller area of tissue) as factors in any observed increase in total radiation dose and exposure times.
Improving the safety profile of this procedure could have a positive impact on the patient population receiving fluoroscopic hip injections, both within the VA Ann Arbor Health System and elsewhere. While the study population was limited to the VA patient population seeking subspecialty nonsurgical joint care at a single tertiary care center, this technique is generalizable and can be used in most patients, as hip pain is a common condition necessitating nonoperative evaluation and treatment.
Radiation Exposures
As our analysis demonstrates, mean radiation dose exposure for each group was consistent with low (≤ 3 mSv) to moderate (> 3-20 mSv) annual effective doses in the general population.7 Both anterior and lateral median radiation dose of 1 mGy and 3 mGy, respectively, are within the standard exposure for radiographs of the pelvis (1.31 mGy).9 It is therefore reasonable to consider a lateral approach for hip injection, given the benefits of direct coaxial approach and avoiding needle entry through higher bacteria-concentrated skin.
The lateral approach did have increased radiation dose and exposure time, although it was not statistically significantly greater than the anterior approach. The difference between radiation dose and time to perform either technique was not clinically significant. One potential explanation for this is that the lateral technique has increased tissue to penetrate, which can be reduced with collimation and other fluoroscopic image adjustments. Additionally, as trainees progress in competency, fewer images should need to be obtained.7 We hypothesize that as familiarity and comfort with this technique increase, the number of images necessary for successful injection would decrease, leading to decreased radiation dose and exposure time. We would expect that in the hands of a board-certified interventionalist, radiation dose and exposure time would be significantly decreased as compared to our current dataset, and this is an area of planned further study. With our existing dataset, the majority of procedures were performed with trainees, with inadequate information documented for comparison of dose over time and procedural experience under individual physicians.
Notable strengths of this study are the direct comparison of the anterior approach when compared to the lateral approach with regard to radiation dose and exposure time, which we have not seen described in the literature. A detailed description of the technique may result in increased utilization by other providers. Data were collected from multiple providers, as board-certified pain physicians and board-eligible interventional pain fellows performed the procedures. This variability in providers increases the generalizability of the findings, with a variety of providers, disciplines, years of experiences, and type of training represented.
Limitations
Limitations include the retrospective nature of the study and the relatively small sample size. However, even with this limitation, it is notable that no statistically significant differences were observed in mean radiation dose or fluoroscopy exposure time, making the lateral approach, at minimum, a noninferior technique. Combined with the improved safety profile, this technique is a viable alternative to the traditional anterior-oblique approach. Further study should be performed, such as a prospective, randomized control trial investigating the 2 techniques and following pain scores and functional ability after the procedure.
Conclusion
Given the decreased procedural risk related to proximity of neurovascular structures and coaxial technique for needle advancement, lateral approach for hip injection should be considered by those in any discipline performing fluoroscopically guided procedures. Lateral technique may be particularly useful in technically challenging cases and when skin entry at the anterior groin is suboptimal, as a noninferior alternative to traditional anterior method.
1. Cianfoni A, Boulter DJ, Rumboldt Z, Sapton T, Bonaldi G. Guidelines to imaging landmarks for interventional spine procedures: fluoroscopy and CT anatomy. Neurographics. 2011;1(1):39-48.
2. Leopold SS, Battista V, Oliverio JA. Safety and efficacy of intraarticular hip injection using anatomic landmarks. Clin Orthop Relat Res. 2001;(391):192-197.
3. Dodré E, Lefebvre G, Cockenpot E, Chastanet P, Cotten A. Interventional MSK procedures: the hip. Br J Radiol. 2016;89(1057):20150408.
4. Hankey S, McCall IW, Park WM, O’Connor BT. Technical problems in arthrography of the painful hip arthroplasty. Clin Radiol. 1979;30(6):653-656.
5. Yasar E, Singh JR, Hill J, Akuthota V. Image-guided injections of the hip. J Nov Physiother Phys Rehabil. 2014;1(2):39-48.
6. Aly R, Maibach HI. Aerobic microbial flora of intertrigenous skin. Appl Environ Microbiol. 1977;33(1):97-100.
7. Fazel R, Krumholz HM, Wang W, et al. Exposure to low-dose ionizing radiation from medical imaging procedures. N Engl J Med. 2009;361(9):849-857.
8. Masoud MA, Said HG. Intra-articular hip injection using anatomic surface landmarks. Arthosc Tech. 2013;2(2):e147-e149.
9. Ofori K, Gordon SW, Akrobortu E, Ampene AA, Darko EO. Estimation of adult patient doses for selected x-ray diagnostic examinations. J Radiat Res Appl Sci. 2014;7(4):459-462.
The Shot That Won the Revolutionary War and Is Still Reverberating
The disputes about those who decline to vaccinate their children for communicable infectious diseases, especially measles, have been in the headlines of late. Those refusals are often done in the name of “medical freedom.”1 Yet this is a much older debate for the military. It seems fitting in this month in which we celebrate the 243rd anniversary of the Declaration of Independence to reflect on the earliest history of the interaction between vaccinations and war in the US and what it tells us about the fight for religious and political freedom and individual liberty.
Go back in time with me to 1776, long before the Fourth of July was a day for barbecues and fireworks. We are in Boston, Philadelphia, and other important cities in colonial America. This time, concern was not about measles but the even more dreaded smallpox. In the first years of the Revolutionary War, General George Washington took command of a newly formed and named Continental Army. A catastrophic 90% of casualties in the Continental Army were from infectious diseases, with the lion’s share of these from smallpox, which at that time had a mortality rate of about 30%.2,3
Early efforts to introduce inoculation into the colonies had failed for many of the same reasons parents across the US today refuse immunization: fear and anxiety. When the renowned New England Puritan minister and scientist Cotton Mather attempted in 1721 to introduce variolation, his house was firebombed, and his fellow clergy and physicians alleged that his efforts at inoculation defied God’s will in sending the plague.3 Variolation was the now antiquated and then laborious process in which a previously unexposed individual was inoculated with material from the vesicle of someone infected with the disease.4,5 Variolation was practiced in parts of Africa and Asia and among wealthy Europeans but remained controversial in many colonies where few Americans had been exposed to smallpox or could afford the procedure.3
It is important to note that variolation was practiced before Edward Jenner famously demonstrated in 1798 that cowpox vaccine could provide immunity to smallpox. The majority of those inoculated developed a mild case of smallpox requiring about 5 weeks of illness and recovery but conferring lifelong immunity. During those 5 weeks, however, they remained a vector of disease for the uninoculated. Southern and New England colonies passed laws prohibiting variolation. Those anti-inoculation attitudes were the basis for the 1776 order to the surgeons general of the Continental Army forbidding all inoculations of the troops, despite the fact that perhaps only 25% of soldiers possessed any natural immunity.2,3
There was yet another reason that many colonial Americans opposed government-sponsored preventative care, and it was the same reason that they were fighting a war of independence: distrust and resentment of authority. The modern antivaccine movement voices similar fears and suspicions regarding public health campaigns and especially legislative efforts to mandate vaccinations or remove extant exemptions.
In 1775 in Boston, a smallpox outbreak occurred at the same time the Americans laid siege to the British troops occupying the city. Greater natural immunity to the scourge of smallpox either through exposure or variolation provided the British with a stronger defense than the mere city fortifications. There are even some suspicions that the British used the virus as a proto-biologic weapon.
General Washington had initially been against inoculation until he realized that without it the British might win the war. This possibility presented him with a momentous decision: inoculate, despite widespread anxiety that variolation itself would spread the disease, or risk the virus ravaging the fighting force. Perhaps the most compelling reason to variolate was that new recruits were refusing to sign up, fearing not death in battle but smallpox. In 1777, Washington mandated variolation of the nonimmune troops and new recruits, making it the first large-scale military preventative care measure in history.
Recapitulating an ethical dilemma that still rages in the military nearly 3 centuries later, inoculation was voluntary for British soldiers but compulsory for the Americans. There was so much opposition to Washington’s order that communications with the surgeons were kept secret, and commanding officers had to oversee the inoculations.2,3
Washington’s policy not only contributed mightily to the American victory in the war, but also set the precedent for compulsory vaccination in the US military for the next 3 centuries. Currently, regulations require that service members be vaccinated for multiple infectious diseases. Of interest, this mandatory vaccination program has led to no reported cases of measles among military families to date, in part because of federal regulations requiring families of those service members to be vaccinated.6
Ironically, once General Washington made the decision for mass inoculation, he encountered little actual resistance among the troops. However, throughout military history some service members have objected to compulsory vaccination on medical, religious, and personal grounds. In United States v Chadwell, a military court ruled against 2 Marine Corps members who refused vaccination for smallpox, typhoid, paratyphoid, and influenza, citing religious grounds. The court opined that the military orders that ensure the health and safety of the armed forces and thereby that of the public override personal religious beliefs.7
The paradox of liberty—the liberty first won in the Revolutionary War—is that in a pluralistic representative democracy like ours, to secure the freedom for all, some, such as the military, must relinquish the very choice to refuse. Their sacrifices grant liberty to others. On June 6, we commemorated the 75th anniversary of D-Day, remembering the great cost of the eternal vigilance that the patriot Thomas Paine said was the price of liberty. On Memorial Day, we remember all those men and women who died in the service of their country. And while they gave up the most precious gift, we must never forget that every person in uniform also surrenders many other significant personal freedoms so that their fellow civilians may exercise them.
The question General Washington faced is one that public health authorities and our legislators again confront. When should the freedom to refuse, which was won with the blood of many valiant heroes and has been defended since 1776, be curtailed for the greater good? We are the one nation in history that has made the defense of self-determination its highest value and in so doing, its greatest challenge.
1. Sun LH. Senate panel warns of dangers of anti-vaccine movement. https://www.washingtonpost.com/health/2019/03/05/combat-anti-vaxxers-us-needs-national-campaign-top-washington-state-official-says/?utm_term=.9a4201be0ed1. Published March 5, 2019. Accessed June 9, 2019.
2. Filsinger AL, Dwek R. George Washington and the first mass military inoculation. http://www.loc.gov/rr/scitech/GW&smallpoxinoculation.html. Published February 12, 2009. Accessed June 10, 2019.
3. Fenn EA. Pox Americana. New York: Hill and Wang; 2001.
4. Stedman’s Medical Dictionary. 28th edition. Philadelphia, PA: Lippincott Williams & Wilkins; 2006.
5. Artenstein AW, Opal JM, Opal SM, Tramont EC, Georges P, Russell PK. History of U.S. military contributions to the study of vaccines and infectious diseases. Mil Med. 2005;170(suppl 4):3-11.
6. Jowers K. So far, no measles cases at military medical facilities—but officials are watching. https://www.militarytimes.com/pay-benefits/2019/04/19/so-far-no-measles-cases-at-military-medical-facilities-but-officials-are-watching/. Published April 19, 2019. Accessed June 9, 2019.
7. Cole JP, Swendiman KS. Mandatory vaccinations: precedent and current laws. https://fas.org/sgp/crs/misc/RS21414.pdf. Published May 21, 2014. Accessed June 10, 2019.
Flu vaccine succeeds in TNF inhibitor users
MADRID – Influenza vaccination is similarly effective for individuals taking a tumor necrosis factor (TNF) inhibitor and healthy controls, but the number needed to vaccinate to prevent one case of influenza for patients taking a TNF inhibitor is much lower, according to data from a study presented at the European Congress of Rheumatology.
The number needed to vaccinate (NNV) to prevent one case of influenza among healthy control patients was 71, compared with an NNV of 10 for patients taking the TNF inhibitor adalimumab (Humira), reported Giovanni Adami, MD, and colleagues at the University of Verona (Italy).
While TNF inhibitors “are known to increase the risk of infection by suppressing the activity of the immune system,” it has not been clear whether the response to vaccination is impaired in patients treated with a TNF inhibitor, Dr. Adami said.
Dr. Adami and colleagues reviewed data from 15,132 adult patients exposed to adalimumab in global rheumatoid arthritis clinical trials and 71,221 healthy controls from clinical trials of influenza vaccines. Overall, the rate of influenza infection was similarly reduced with vaccination in both groups. The rate in healthy individuals went from 2.3% for those unvaccinated to 0.9% for those vaccinated; for TNF inhibitor–treated patients, the rate was 14.4% for those unvaccinated versus 4.5% for those vaccinated.
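Those rates are enough to reproduce the reported NNV figures: the NNV is simply the reciprocal of the absolute risk reduction. The short sketch below is purely illustrative (written in Python; the function name and the rounding to the nearest whole person are assumptions, not part of the study) and shows how the figures of 71 and roughly 10 fall out of the rates above.

```python
# Minimal sketch: number needed to vaccinate (NNV) from the reported infection rates.
# NNV = 1 / absolute risk reduction, where the reduction is
# (rate among the unvaccinated) - (rate among the vaccinated).

def nnv(rate_unvaccinated: float, rate_vaccinated: float) -> float:
    absolute_risk_reduction = rate_unvaccinated - rate_vaccinated
    return 1 / absolute_risk_reduction

print(round(nnv(0.023, 0.009)))  # healthy controls: 1 / 0.014, about 71
print(round(nnv(0.144, 0.045)))  # TNF inhibitor-treated patients: 1 / 0.099, about 10
```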
“It is not surprising that the number needed to vaccinate is dramatically lower in patients treated with immunosuppressors, compared to healthy individuals,” Dr. Adami noted. “As a matter of fact, patients treated with such drugs are at higher risk of infections, namely they have a greater absolute risk of influenza. Nevertheless, [it] is quite surprising that the relative risk reduction is similar between TNF inhibitor–treated patients and healthy controls, meaning that the vaccination is efficacious in both the cohorts.”
The researchers also calculated the cost to prevent one case of influenza, using a cost of approximately 16.5 euros per vaccine (Dr. Adami also cited an average U.S. cost of about $40 per vaccine). Using this method, they estimated a cost of about 1,174 euros (roughly $1,340) to prevent one influenza infection in the general population, and about 165 euros (roughly $188) to vaccinate enough TNF inhibitor–treated patients to prevent one infection.
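The cost estimates follow directly from those NNVs: multiplying the NNV by the per-dose cost approximates the spend required to prevent one case. A minimal continuation of the sketch above, assuming a flat 16.5 euros per dose as reported (small differences from the published 1,174-euro figure likely reflect rounding of the intermediate rates):

```python
# Minimal sketch: approximate cost to prevent one influenza case,
# assuming a flat per-dose cost of 16.5 euros as cited in the report.
COST_PER_DOSE_EUR = 16.5

print(71 * COST_PER_DOSE_EUR)  # general population: ~1,172 euros (reported as ~1,174)
print(10 * COST_PER_DOSE_EUR)  # TNF inhibitor-treated patients: 165 euros
```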
Dr. Adami advised clinicians to remember the low NNV for TNF inhibitor–treated patients with regard to influenza vaccination. “A direct disclosure of the NNV for these patients might help adherence to vaccinations,” he said.
Next steps for research should include extending the real-world effectiveness analysis to other medications and other diseases, such as zoster vaccination in patients treated with Janus kinase inhibitors, Dr. Adami said.
Dr. Adami had no financial conflicts to disclose. Several coauthors disclosed relationships with companies including Abiogen Pharma, Grünenthal, Amgen, Janssen-Cilag, Mundipharma, and Pfizer.
Mitchel L. Zoler contributed to this report.
SOURCE: Adami G et al. Ann Rheum Dis. Jun 2019;78(Suppl 2):192-3. Abstract OP0230, doi: 10.1136/annrheumdis-2019-eular.3088
REPORTING FROM EULAR 2019 CONGRESS