Tagraxofusp produces high response rate in BPDCN
Tagraxofusp demonstrated efficacy in a phase 2 trial of patients with previously treated or untreated blastic plasmacytoid dendritic cell neoplasm (BPDCN).
The overall response rate was 90% in previously untreated patients who received the highest dose of tagraxofusp and 67% in patients with relapsed/refractory BPDCN.
The researchers wrote that capillary leak syndrome (CLS) was an important adverse event in this trial, as it caused two deaths. However, the researchers developed strategies that appear to mitigate the risk of CLS in patients taking tagraxofusp.
Naveen Pemmaraju, MD, of the University of Texas MD Anderson Cancer Center, Houston, and his colleagues conducted the trial and reported the results in the New England Journal of Medicine.
The trial included 47 patients – 32 with previously untreated BPDCN and 15 with relapsed/refractory BPDCN. The patients’ median age at baseline was 70 years and 83% were men.
Three patients (all previously untreated) received tagraxofusp at 7 mcg/kg per day, and 44 patients received a 12 mcg/kg per day dose. All patients were treated on days 1-5 of a 21-day cycle.
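The per-cycle drug exposure implied by this schedule is simple arithmetic: the daily dose is weight-based and given on only 5 of the 21 days. A minimal sketch (the 70-kg patient weight below is illustrative, not from the trial):

```python
def tagraxofusp_cycle_dose_mcg(weight_kg, dose_mcg_per_kg=12):
    """Total tagraxofusp per 21-day cycle.

    The drug is given on days 1-5 only, so the cycle total is the
    daily weight-based dose times 5. The trial used 7 or 12 mcg/kg
    per day; 12 mcg/kg is the default here.
    """
    return dose_mcg_per_kg * weight_kg * 5

# Illustrative 70-kg patient on the 12 mcg/kg dose:
print(tagraxofusp_cycle_dose_mcg(70))  # 4200 mcg over days 1-5
```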
Response and survival
In the 29 previously untreated patients who received the 12 mcg/kg dose of tagraxofusp, the overall response rate was 90%. The rate of complete response plus clinical complete response in these patients was 72%.
In the 15 patients with relapsed/refractory BPDCN, the overall response rate was 67%, and the rate of complete response plus clinical complete response was 33%.
A total of 14 patients, 13 of whom had previously untreated BPDCN, went on to transplant.
In the 29 previously untreated patients, the median overall survival was not reached at a median follow-up of 25 months. The overall survival rate was 62% at 12 months, 59% at 18 months, and 52% at 24 months.
In the 15 previously treated patients, the median overall survival was 8.5 months.
Safety
Common adverse events in this trial were ALT increase (64%), AST increase (60%), hypoalbuminemia (55%), peripheral edema (51%), thrombocytopenia (49%), nausea (45%), pyrexia (45%), and fatigue (45%).
Among the 44 patients who received the 12 mcg/kg dose of tagraxofusp, 8 (18%) developed CLS. Six patients had grade 2 CLS, one had grade 4, and one had grade 5. There was an additional CLS-related death in a patient who received tagraxofusp at 7 mcg/kg.
After the first death, the trial protocol was amended to reduce CLS risk. Inclusion criteria were changed so that patients must have normal cardiac function, adequate kidney function, and serum albumin of at least 3.2 g/dL. Additionally, the researchers began monitoring patients’ weight, albumin levels, and kidney function. The team withheld tagraxofusp if patients experienced rapid weight gain or if their serum albumin or systolic blood pressure fell too low.
The trial was sponsored by Stemline Therapeutics. The researchers reported relationships with Stemline and other companies.
FROM THE NEW ENGLAND JOURNAL OF MEDICINE
Key clinical point: Tagraxofusp produced responses in patients with blastic plasmacytoid dendritic cell neoplasm (BPDCN).
Major finding: The overall response rate was 90% in previously untreated patients who received the highest dose of tagraxofusp and 67% in patients with relapsed/refractory BPDCN.
Study details: A phase 2 trial of 47 patients, 32 with previously untreated BPDCN and 15 with relapsed/refractory BPDCN.
Disclosures: The trial was sponsored by Stemline Therapeutics. The researchers reported relationships with Stemline and other companies.
Source: Pemmaraju N et al. N Engl J Med. 2019;380:1628-37.
Texans’ rattler diet, recycled humans, and, ahem, Puber
Endgame for arachnophobia
We think Tony Stark would like this creative solution for spider and ant phobias. Comic book movies have now infiltrated every aspect of culture, including serious scientific research. And let’s be honest – more than one scientist has been inspired to go into their fields by Bruce Banner, Stark, or maybe even Doctor Octopus (no judgment).
A group of (possibly mad) scientists has tested exposure therapy for spider and ant phobias in people by showing participants the Spider-Man and Ant-Man movies. While the viewing material may not be totally scientifically accurate, researchers found that watching seven seconds of Spider-Man 2 or Ant-Man reduced spider and ant phobia by 20%.
The participants were specifically exposed to ants and spiders in the context of the movies, so surprisingly the phobia reduction had nothing to do with Tobey Maguire or Paul Rudd.
Old poop, new discovery
Here at LOTME, we like us some good bathroom humor. And don’t worry, we won’t ever change. In this week’s edition of the Wonderful World of Poop, we take you to Texas, 1,500 years ago. The sky was bigger, the air was fresher, and the humans of the Lower Pecos region were as hardcore as you can get. A recent re-examination of coprolite samples taken from the region found one interesting chunk of poop-rock that contained an entire rattlesnake.
Now, the presence of snake bits in early human poo is not that crazy; people ate (and still eat) snakes. The appearance of a centimeter-long fang, scales, and bones, however, did take the researchers by surprise. Why would someone eat a snake? Was it an ancient way to inoculate against snake venom? Or perhaps crunchy snake fangs were the world’s earliest version of a Cheeto?
In fact, researchers hypothesized that this dietary behavior was not normal for the people of the Lower Pecos, and most likely was more ceremonial. You know, the casual eating-a-full-snake ceremony.
Will Texans embrace this ceremony of their past and start chomping on rattlers? Who’s to say? All we know is that poop is the gift that keeps on giving.
A new way to soil yourself
If you’re reading this, we can say with some certainty that you managed to survive another Tax Day. Congratulations! But there’s still Benjamin Franklin’s other ultimate certainty of life. You know … the big sleep, the last roundup, assume room temperature, buy the farm, shuffle off this mortal coil, give up the ghost, and so on.
What are you going to do about that?
A big question, for sure, so let’s just focus on the earthly remains. A company called Recompose has a new alternative to burial and cremation, something they’ve dubbed “natural organic reduction” and others have described as “human composting” or “accelerated decomposition.” In a pilot project last year at Washington State University in Pullman, the Recompose process transformed the bodies of six donors to soil in 4-7 weeks, AP reported.
The company says that natural organic reduction is much more environmentally friendly than current practices, creating a cubic yard of soil per person, and that “friends and family are welcome to take some (or all) home to grow a tree or a garden.”
A garden sounds nice, or maybe something indoors. Just think of the potted plant possibilities: daisies (to push up), a Venus flytrap (the organic reduction continues), some poison ivy (a gift for people you don’t like), or maybe roses. Who wouldn’t want to come out of death smelling like a rose?
San Francisco vs. illegal dumping
Maybe you’re not quite ready to commit to using human remains as compost to fertilize your garden. Perhaps you want to start off only using human poop as fertilizer, see how that goes before sprinkling Grandma all over your tulips.
Well, if you’re looking for a sweet deal, we’re certain San Francisco can work something out with you because, in the past 7 years, incidence of human feces in public places within the city has quintupled, rising from 5,500 reported cases in 2011 to 28,100 cases in 2018.
The problem, likely related to an increasing homeless population who can’t afford San Francisco’s exorbitant rental prices and have limited access to public restrooms, is so bad that the city commissioned a “Poop Patrol” in the summer of 2018 to wipe down some of the poorer, more suspect neighborhoods.
While the upstanding members of the Poop Patrol are almost certainly doing a fine job, it’s probably safe to say that human fecal clean-up is an industry ripe for disruption.
We look forward to the inevitable Silicon Valley start-up and for the media to hail it as “Uber, but for poop.”
Measuring hydroxychloroquine blood levels could inform safe, optimal dosing
SAN FRANCISCO – The risk of hydroxychloroquine retinopathy appears to be far lower than recently reported, and measuring blood levels of the drug could inform safe, optimal dosing, according to an investigation of the Hopkins Lupus Cohort, an ongoing longitudinal study of lupus patients in the Baltimore area.
As innocuous as the assertions seem, they are anything but. They directly contradict a 2014 investigation from Kaiser Permanente that put the retinopathy risk after 20 years at almost 40%; that finding led directly to an American Academy of Ophthalmology recommendation to reduce the maximum hydroxychloroquine dose from 6.5 mg/kg per day ideal weight to 5 mg/kg real weight, where it remains to this day.
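To see concretely how the guideline change tightened dosing, here is a minimal sketch of the arithmetic behind the two ceilings (the patient weights below are illustrative):

```python
def max_hcq_dose_old(ideal_weight_kg):
    """Pre-2016 AAO ceiling: 6.5 mg/kg per day of *ideal* body weight."""
    return 6.5 * ideal_weight_kg

def max_hcq_dose_new(real_weight_kg):
    """Revised AAO ceiling: 5 mg/kg per day of *actual* body weight."""
    return 5.0 * real_weight_kg

# Illustrative patient: 60 kg ideal body weight, 75 kg actual weight.
print(max_hcq_dose_old(60))   # 390.0 mg/day under the old rule
print(max_hcq_dose_new(75))   # 375.0 mg/day under the revised rule
```

Note that for heavier patients the two rules can diverge substantially, which is why the change matters clinically.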
Meanwhile, very few rheumatologists have access to hydroxychloroquine (HCQ) blood levels because most commercial labs don’t offer them. Plasma testing is widely available, but it’s nowhere near as good, according to Michelle Petri, MD, a rheumatology professor at Johns Hopkins University, Baltimore; director of the Hopkins Lupus Cohort; and a respected authority on lupus management.
“The Kaiser Permanente study was very worrisome,” she said. “I remember that I thought it didn’t fit my practice at all; I don’t see 40%. It made me even more concerned when the ophthalmologists” reduced the dose, “because hydroxychloroquine is the most important medicine I have for my lupus patients; it is the only one that improves survival. We don’t want to scare our patients into thinking that 40% of them are going to have retinopathy.”
Dueling studies
Dr. Petri’s concerns led her and her team to launch their own investigation; they followed 537 Baltimore cohort members on HCQ as they went through eye exams by Hopkins retinopathy specialists, often with optical coherence tomography (OCT). With a sensitivity of 93% and specificity of 84%, OCT is the best screening method available.
“We found that the risk of retinopathy is not nearly as high as Kaiser Permanente found,” just 11.46% (11/96) with 16-20 years of use, and 8% (6/75) with 21 or more years. On average, “the risk is probably about 10% after 16 or more years, not 40%,” Dr. Petri said at an international congress on systemic lupus erythematosus.
Patients with “possible” retinopathy were not included in the analysis.
When asked for comment, Ronald Melles, MD, a Kaiser ophthalmologist in Redwood City, Calif.; one of the two authors on the Kaiser study; and an author on the subsequent AAO recommendations, stood by his work.
“A rate of 12% retinopathy after 16 years of use ... seems right in line with what we found.” However, “the fact that the rate went down to 8%” after 20 years does not make sense; “the longer you are on the medicine, the more likely you would be to develop the toxicity,” he said.
Maybe the fluctuation had to do with the fact that there were only 75 patients in the Hopkins study on HCQ past 20 years, whereas “we looked at 2,361 patients, and 238 were on the medication for” 20 years or more. Patients over 5.0 mg/kg per day had a 5.67-fold higher risk of retinopathy, he said (univariate analysis, P less than .001).
Dr. Petri wasn’t buying it. The across-the-board recommendation was made “without any recognition that if you reduce the dose, you reduce the benefit,” she said.
A new referee: blood levels
Dr. Petri and her team also found that HCQ blood levels correlated with retinopathy, and it was a direct relationship. Patients in the highest maximum tertile (1,753-6,281 ng/mL) had a retinopathy rate of 6.7%, a good deal higher than patients in lower tertiles. It was the same story with the highest mean tertile (1,117-3,513 ng/mL). Retinopathy in that group occurred in 7.9% versus 3.7% or less in lower tertiles. The findings were statistically significant.
Patients in the third tertile “are at the greatest risk, so I reduce their dose,” but “I do not want to reduce the dose across the board” to 5 mg/kg per day; that’s overreach. The tertile approach, if it pans out, might be a better way, she said.
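The tertile approach described above amounts to computing cut points from a cohort's blood-level distribution and flagging patients above the upper cut. A minimal sketch, assuming whole-blood HCQ levels in ng/mL (the patient IDs and levels below are invented for illustration):

```python
from statistics import quantiles

def flag_top_tertile(levels_ng_ml):
    """Return the IDs of patients whose HCQ blood level falls in the
    highest tertile of the cohort.

    `levels_ng_ml` maps a patient ID to a whole-blood HCQ level (ng/mL).
    Tertile cut points are computed from the cohort's own distribution,
    mirroring the tertile analysis described above.
    """
    cuts = quantiles(levels_ng_ml.values(), n=3)  # two cut points
    upper_cut = cuts[1]
    return {pid for pid, level in levels_ng_ml.items() if level > upper_cut}

# Hypothetical six-patient cohort:
cohort = {"pt1": 450, "pt2": 980, "pt3": 1300,
          "pt4": 2100, "pt5": 3400, "pt6": 720}
print(sorted(flag_top_tertile(cohort)))  # → ['pt4', 'pt5']
```

In practice the published tertile boundaries (e.g., a highest mean tertile of 1,117-3,513 ng/mL) could be used as fixed thresholds instead of recomputing them per cohort.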
The problem with plasma levels is that HCQ binds to red blood cells, so plasma levels are artificially low and do not indicate the true HCQ load. For now, just one company in the United States offers HCQ blood levels: Exagen.
“We have to get the big companies to start offering” this, and “I want rheumatologists to adopt it. I am lucky at Hopkins [because] we have our own homegrown blood level assay,” she said.
Dr. Melles agreed that tracking blood level makes sense, “but the literature I am aware of has not been able to closely correlate either lupus disease activity or retinal toxicity with blood levels. Also, we have seen some patients at lower doses develop toxicity and other patients on higher doses without any detectable changes.”
Still, “we would like to see [this] studied more, perhaps with newer analytic methods,” said his coauthor on the Kaiser study, and also the lead author on the AAO guidelines, Michael Marmor, MD, professor emeritus of ophthalmology at Stanford (Calif.) University.
In the end, on the same team
Dr. Petri said there is interest among some of her fellow members of the American College of Rheumatology to work with AAO to revise the guidelines. “Until then,” she said, “I want the ophthalmologists to withdraw” them.
She’s worried about undertreatment and believes that the previous AAO guideline, up to 6.5 mg/kg per day ideal weight, was fine, “with some understanding that there are high-risk groups, such as the elderly and people with renal impairment, where the dose should be reduced.”
“No matter how obese a patient is, I cap it at 400 mg/day,” she said, and, with the luxury of HCQ blood level testing, “no matter the weight, if the person is in the upper tertile, I reduce the dose.”
Dr. Marmor agreed that “if rheumatologists prescribe 5 mg/kg real weight and do not stress compliance, some patients may indeed be underdosed.”
“However, that is a fault of the doctor and patient relationship,” he said, “not the guideline; we do not feel it ethical to prescribe higher doses which could increase toxicity in reliable patients ... just because some patients might be unreliable.”
Overall, “I have not heard complaints from rheumatologists in our area, who try hard to follow the current recommendations. ... any doctor can use the dose he or she feels is necessary for a patient. Several recent reports [also] suggest the incidence of toxicity is falling now with usage of AAO guidelines, [and] I am not aware of any data” showing that management has become less effective, he said.
In the meantime, “I assure you that AAO wants ... to serve both specialties, and will change the guidelines when there is new, defensible data,” he added.
The Hopkins team found that the risk of HCQ retinopathy was highest in men and white patients, as well as older people. Body mass index and hypertension also predicted retina issues.
“As screening tests are frequently abnormal due to causes other than HCQ ... stopping [it] based on an abnormal test without confirmation from a retinopathy expert could needlessly deprive an SLE patient of an important medication,” they said.
The Hopkins Lupus Cohort is funded by the National Institutes of Health. The physicians didn’t have any relevant disclosures.
SOURCES: Petri M et al. Lupus Sci Med. 2019;6(suppl 1). Abstracts 15 and 16.
SAN FRANCISCO – , according to an investigation of the Hopkins Lupus Cohort, an ongoing longitudinal study of lupus patients in the Baltimore area.
As innocuous as the assertions seem, they are anything but. They directly contradict a 2014 investigation from Kaiser Permanente that put the retinopathy risk after 20 years at almost 40%; that finding led directly to an American Academy of Ophthalmology recommendation to reduce the maximum hydroxychloroquine dose from 6.5 mg/kg per day ideal weight to 5 mg/kg real weight, where it remains to this day.
Meanwhile, very few rheumatologists have access to hydroxychloroquine (HCQ) blood levels because most commercial labs don’t offer them. Plasma testing is widely available, but it’s nowhere near as good, according to Michelle Petri, MD, a rheumatology professor at Johns Hopkins University, Baltimore; director of the Hopkins Lupus Cohort; and a respected authority on lupus management.
“The Kaiser Permanente study was very worrisome,” she said. “I remember that I thought it didn’t fit my practice at all; I don’t see 40%. It made me even more concerned when the ophthalmologists” reduced the dose, “because hydroxychloroquine is the most important medicine I have for my lupus patients; it is the only one that improves survival. We don’t want to scare our patients into thinking that 40% of them are going to have retinopathy.”
Dueling studies
Dr. Petri’s concerns led her and her team to launch their own investigation; they followed 537 Baltimore cohort members on HCQ as they went through eye exams by Hopkins retinopathy specialists, often with optical coherence tomography (OCT). With a sensitivity of 93% and specificity of 84%, OCT is the best screening method available.
“We found that the risk of retinopathy is not nearly as high as Kaiser Permanente found,” just 11.5% (11 of 96) with 16-20 years of use, and 8% (6 of 75) with 21 or more years. On average, “the risk is probably about 10% after 16 or more years, not 40%,” Dr. Petri said at an international congress on systemic lupus erythematosus.
Patients with “possible” retinopathy were not included in the analysis.
When asked for comment, Ronald Melles, MD, a Kaiser ophthalmologist in Redwood City, Calif.; one of the two authors on the Kaiser study; and an author on the subsequent AAO recommendations, stood by his work.
“A rate of 12% retinopathy after 16 years of use ... seems right in line with what we found.” However, “the fact that the rate went down to 8%” after 20 years does not make sense; “the longer you are on the medicine, the more likely you would be to develop the toxicity,” he said.
Maybe the fluctuation had to do with the fact that there were only 75 patients in the Hopkins study on HCQ past 20 years, he suggested, whereas “we looked at 2,361 patients, and 238 were on the medication for” 20 years or more. Patients above 5.0 mg/kg per day had a 5.67-fold higher risk of retinopathy, he said (univariate analysis, P less than .001).
Dr. Petri wasn’t buying it. The across-the-board recommendation was made “without any recognition that if you reduce the dose, you reduce the benefit,” she said.
A new referee: blood levels
Dr. Petri and her team also found that HCQ blood levels correlated with retinopathy, and it was a direct relationship. Patients in the highest maximum tertile (1,753-6,281 ng/mL) had a retinopathy rate of 6.7%, a good deal higher than patients in lower tertiles. It was the same story with the highest mean tertile (1,117-3,513 ng/mL). Retinopathy in that group occurred in 7.9% versus 3.7% or less in lower tertiles. The findings were statistically significant.
Patients in the third tertile “are at the greatest risk, so I reduce their dose,” but “I do not want to reduce the dose across the board” to 5 mg/kg per day; that’s overreach. The tertile approach, if it pans out, might be a better way, she said.
The problem with plasma levels is that HCQ binds to red blood cells, so plasma levels are artificially low and do not indicate the true HCQ load. For now, just one company in the United States offers HCQ blood levels: Exagen.
“We have to get the big companies to start offering” this, and “I want rheumatologists to adopt it. I am lucky at Hopkins [because] we have our own homegrown blood level assay,” she said.
Dr. Melles agreed that tracking blood level makes sense, “but the literature I am aware of has not been able to closely correlate either lupus disease activity or retinal toxicity with blood levels. Also, we have seen some patients at lower doses develop toxicity and other patients on higher doses without any detectable changes.”
Still, “we would like to see [this] studied more, perhaps with newer analytic methods,” said his coauthor on the Kaiser study, and also the lead author on the AAO guidelines, Michael Marmor, MD, professor emeritus of ophthalmology at Stanford (Calif.) University.
In the end, on the same team
Dr. Petri said there is interest among some of her fellow members of the American College of Rheumatology to work with AAO to revise the guidelines. “Until then,” she said, “I want the ophthalmologists to withdraw” them.
She’s worried about undertreatment and believes that the previous AAO guideline, up to 6.5 mg/kg per day ideal weight, was fine, “with some understanding that there are high-risk groups, such as the elderly and people with renal impairment, where the dose should be reduced.”
“No matter how obese a patient is, I cap it at 400 mg/day,” she said, and, with the luxury of HCQ blood level testing, “no matter the weight, if the person is in the upper tertile, I reduce the dose.”
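Dr. Petri’s two dosing rules, weight-based dosing with an absolute daily cap, amount to a simple calculation. The sketch below is purely illustrative of the arithmetic she describes (5 mg/kg real weight, capped at 400 mg/day); it is not clinical guidance, and the function name and parameters are our own.

```python
# Illustrative sketch of the dosing arithmetic described above.
# NOT clinical guidance; names and defaults are hypothetical.
def hcq_daily_dose_mg(weight_kg: float, mg_per_kg: float = 5.0,
                      cap_mg: float = 400.0) -> float:
    """Weight-based hydroxychloroquine dose, capped per Dr. Petri's rule."""
    return min(mg_per_kg * weight_kg, cap_mg)

hcq_daily_dose_mg(70)   # 350.0 mg/day at 5 mg/kg
hcq_daily_dose_mg(100)  # 500 mg/kg-based, but capped at 400.0 mg/day
```

Under this rule, any patient above 80 kg hits the 400 mg/day ceiling, which is why blood-level testing, rather than weight alone, drives her further dose reductions.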
Dr. Marmor agreed that “if rheumatologists prescribe 5 mg/kg real weight and do not stress compliance, some patients may indeed be underdosed.”
“However, that is a fault of the doctor and patient relationship,” he said, “not the guideline; we do not feel it ethical to prescribe higher doses which could increase toxicity in reliable patients ... just because some patients might be unreliable.”
Overall, “I have not heard complaints from rheumatologists in our area, who try hard to follow the current recommendations. ... any doctor can use the dose he or she feels is necessary for a patient. Several recent reports [also] suggest the incidence of toxicity is falling now with usage of AAO guidelines, [and] I am not aware of any data” showing that management has become less effective, he said.
In the meantime, “I assure you that AAO wants ... to serve both specialties, and will change the guidelines when there is new, defensible data,” he added.
The Hopkins team found that the risk of HCQ retinopathy was highest in men and white patients, as well as older people. Body mass index and hypertension also predicted retina issues.
“As screening tests are frequently abnormal due to causes other than HCQ ... stopping [it] based on an abnormal test without confirmation from a retinopathy expert could needlessly deprive an SLE patient of an important medication,” they said.
The Hopkins Lupus Cohort is funded by the National Institutes of Health. The physicians didn’t have any relevant disclosures.
SOURCES: Petri M et al. Lupus Sci Med. 2019;6(suppl 1). Abstracts 15 and 16.
REPORTING FROM LUPUS 2019
In chronic pain, catastrophizing contributes to disrupted brain circuitry
MILWAUKEE – When a patient with acute pain tumbles into a chronic pain state, many factors are at play, according to the widely accepted biopsychosocial theory of pain. Emotional, cognitive, and environmental components all contribute to the persistent and recalcitrant symptoms chronic pain patients experience.
Now, modern neuroimaging techniques show how for some, pain signals hijack the brain’s regulatory networks, allowing rumination and catastrophizing to intrude on the exteroception that’s critical to how humans interact with one another and the world. Interrupting catastrophizing with nonpharmacologic techniques yields measurable improvements – and there’s promise that a single treatment session can make a lasting difference.
“Psychosocial phenotypes, such as catastrophizing, are part of a complex biopsychosocial web of contributors to chronic pain,” said Robert R. Edwards, PhD, a psychologist at Brigham and Women’s Hospital/Harvard Medical School (Boston) Pain Management Center. Dr. Edwards moderated a session focused on catastrophizing at the scientific meeting of the American Pain Society.
Through magnetic resonance imaging techniques that measure functional connectivity, researchers can now see how nodes in the brain form connected networks that are differentially activated.
For example, the brain’s salience network (SLN) responds to stimuli that merit attention, such as evoked or clinical pain, Vitaly Napadow, PhD, said during his presentation. Key nodes in the SLN include the anterior cingulate cortex, the anterior insula, and the anterior temporoparietal junction. One function of the salience network, he said, is to regulate switching between the default mode network (DMN) – an interoceptive network – and the central executive network, usually active in exteroceptive tasks.
“The default mode network has been found to play an important role in pain processing,” Dr. Napadow said. These brain regions are more active in self-referential cognition – thinking about oneself – than when performing external tasks, he said. Consistently, studies have found decreased DMN deactivation in patients with chronic pain; essentially, the constant low hum of pain-focused DMN activity never turns off in a chronic pain state.
For patients with chronic pain, high levels of catastrophizing mean greater impact on functional brain connectivity, said Dr. Napadow, director of the Center for Integrative Pain NeuroImaging at the Martino Center for Biomedical Imaging at Massachusetts General Hospital and Harvard Medical School, Boston.
Looking at patients with chronic low back pain, he and his research team looked for connections between the DMN and the insula, which has a central role in pain processing. This connectivity was increased only in patients with high catastrophizing scores, said Dr. Napadow, with increased DMN-insula connectivity associated with increased pain scores only for this subgroup (Pain. 2019 Mar 4. doi: 10.1097/j.pain.0000000000001541).
“The model that we’re moving toward is that chronic pain leads to a blurring in the canonical network” of brain connectivity, Dr. Napadow said. “The speculation here is that the DMN-SLN linkage could be a sort of neural substrate for a common perception that chronic pain patients have – that their pain becomes part of who they are. Their interoceptive state becomes linked to the pain they are feeling: They are their pain.”
Where to turn with this information, which has large clinical implications? “Catastrophizing is a consistent risk factor for poor pain treatment outcomes, especially when we’re talking about pharmacologic treatments,” Dr. Edwards said. Also, chronic pain patients with the highest catastrophizing scores have the most opioid-related side effects, he said.
“Cognitive-behavioral therapy is potentially the most effective at reducing this risk factor,” said Dr. Edwards, noting that long-term effects were seen at 6 and 12 months post treatment. “These are significant, moderate-sized effects; there is some evidence that effects are largest in those with the highest baseline pain catastrophizing scores.”
“CBT is considered the gold standard, mainly because it’s the best studied” among treatment modalities, psychologist Beth Darnall, PhD, pointed out in her presentation. There’s evidence that other nonpharmacologic interventions can reduce catastrophizing: Psychology-informed yoga practices, physical therapy, and certain medical devices, such as high-frequency transcutaneous electric nerve stimulation units, may all have efficacy against catastrophizing and the downward spiral of chronic pain.
Still, a randomized controlled trial of CBT for pain in patients with fibromyalgia showed that the benefit, measured as reduction in pain interference with daily functioning, was almost twice as high in the high-catastrophizing group, “suggesting the potential utility of this method for patients at greatest risk,” said Dr. Edwards.
“We see a specific pattern of alterations in chronic pain similar to that seen in anxiety disorder; this suggests that some individuals are primed for the experience of pain,” said Dr. Darnall, clinical professor of anesthesiology, perioperative medicine, and pain medicine at Stanford (Calif.) University. “We are not born with the understanding of how to modulate pain and the distress it causes us.”
When she talks to patients, Dr. Darnall said: “I describe pain as being our ‘harm alarm.’ ... I like to describe it to people that ‘you have a very protective nervous system.’ ”
Dr. Darnall and her colleagues reported success with a pilot study of a single 2.5-hour-long session that addressed pain catastrophizing. The 57 participants saw a significant decrease in mean scores on the Pain Catastrophizing Scale, from 26.1 at baseline to 13.8 at week 4 (d [effect size] = 1.15).
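The d = 1.15 reported for the pilot is a standardized effect size: the mean change divided by the standard deviation of that change. The standard deviation below is hypothetical (chosen only so the arithmetic reproduces the reported d of about 1.15; the abstract’s actual SD is not given here).

```python
# Illustrative only: Cohen's d for a within-group change, matching the
# pilot's Pain Catastrophizing Scale drop from 26.1 to 13.8.
mean_baseline = 26.1
mean_week4 = 13.8
sd_change = 10.7  # HYPOTHETICAL value chosen so d comes out near 1.15

d = (mean_baseline - mean_week4) / sd_change
print(round(d, 2))  # ~1.15
```

By conventional benchmarks, d above 0.8 is considered a large effect, which is why the authors characterize the single-session result as promising enough to warrant the randomized trial described below.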
On the strength of these early findings, Dr. Darnall and her collaborators are embarking on a randomized controlled trial; the 3-arm comparative effectiveness study will compare a single-session intervention against 8 weeks of CBT or education-only classes for individuals with catastrophizing and chronic pain. The trial is structured to test the hypothesis that the single-session intervention will be noninferior to the full 8 weeks of CBT, Dr. Darnall said.
Building on the importance of avoiding stigmatizing and pejorative terms when talking about pain and catastrophizing, Dr. Darnall said she’s moved away from using the term “catastrophizing” in patient interactions. The one-session intervention is called “Empowered Relief – Train Your Brain Away from Pain.”
There’s a practical promise to a single-session class: Dr. Darnall has taught up to 85 patients at once, she said, adding, “This is a low-cost and scalable intervention.”
Dr. Edwards and Dr. Napadow reported funding from the National Institutes of Health, and they reported no conflicts of interest. Dr. Darnall reported funding from the NIH and the Patient-Centered Outcomes Research Institute. She serves on the scientific advisory board of Axial Healthcare and has several commercial publications about pain.
MILWAUKEE – When a patient with acute pain tumbles into a chronic pain state, many factors are at play, according to the widely accepted biopsychosocial theory of pain. Emotional, cognitive, and environmental components all contribute to the persistent and recalcitrant symptoms chronic pain patients experience.
Now, modern neuroimaging techniques show how for some, pain signals hijack the brain’s regulatory networks, allowing rumination and catastrophizing to intrude on the exteroception that’s critical to how humans interact with one another and the world. Interrupting catastrophizing with nonpharmacologic techniques yields measurable improvements – and there’s promise that a single treatment session can make a lasting difference.
“Psychosocial phenotypes, such as catastrophizing, are part of a complex biopsychosocial web of contributors to chronic pain. ,” said Robert R. Edwards, PhD, a psychologist at Brigham and Women’s Hospital/Harvard Medical School (Boston) Pain Management Center. Dr. Edwards moderated a session focused on catastrophizing at the scientific meeting of the American Pain Society.
Through magnetic resonance imaging techniques that measure functional connectivity, researchers can now see how nodes in the brain form connected networks that are differentially activated.
For example, the brain’s salience network (SLN) responds to stimuli that merit attention, such as evoked or clinical pain, Vitaly Napadow, PhD, said during his presentation. Key nodes in the SLN include the anterior cingulate cortex, the anterior insula, and the anterior temporoparietal junction. One function of the salience network, he said, is to regulate switching between the default mode network (DMN) – an interoceptive network – and the central executive network, usually active in exteroceptive tasks.
“The default mode network has been found to play an important role in pain processing,” Dr. Napadow said. These brain regions are more active in self-referential cognition – thinking about oneself – than when performing external tasks, he said. Consistently, studies have found decreased DMN deactivation in patients with chronic pain; essentially, the constant low hum of pain-focused DMN activity never turns off in a chronic pain state.
For patients with chronic pain, high levels of catastrophizing mean greater impact on functional brain connectivity, said Dr. Napadow, director of the Center for Integrative Pain NeuroImaging at the Martino Center for Biomedical Imaging at Massachusetts General Hospital and Harvard Medical School, Boston.
Looking at patients with chronic low back pain, he and his research team looked for connections between the DMN and the insula, which has a central role in pain processing. This connectivity was increased only in patients with high catastrophizing scores, said Dr. Napadow, with increased DMN-insula connectivity associated with increased pain scores only for this subgroup (Pain. 2019 Mar 4. doi: 10.1097/j.pain.0000000000001541).
“The model that we’re moving toward is that chronic pain leads to a blurring in the canonical network” of brain connectivity, Dr. Napadow said. “The speculation here is that the DMN-SLN linkage could be a sort of neural substrate for a common perception that chronic pain patients have – that their pain becomes part of who they are. Their interoceptive state becomes linked to the pain they are feeling: They are their pain.”
Where to turn with this information, which has large clinical implications? “Catastrophizing is a consistent risk factor for poor pain treatment outcomes, especially when we’re talking about pharmacologic treatments,” Dr. Edwards said. Also, chronic pain patients with the highest catastrophizing scores have the most opioid-related side effects, he said.
“Cognitive-behavioral therapy is potentially the most effective at reducing this risk factor,” said Dr. Edwards, noting that long-term effects were seen at 6 and 12 months post treatment. “These are significant, moderate-sized effects; there is some evidence that effects are largest in those with the highest baseline pain catastrophizing scores.”
“CBT is considered the gold standard, mainly because it’s the best studied” among treatment modalities, psychologist Beth Darnall, PhD, pointed out in her presentation. There’s evidence that other nonpharmacologic interventions can reduce catastrophizing: Psychology-informed yoga practices, physical therapy, and certain medical devices, such as high-frequency transcutaneous electric nerve stimulation units, may all have efficacy against catastrophizing and the downward spiral of chronic pain.
Still, a randomized controlled trial of CBT for pain in patients with fibromyalgia showed that the benefit, measured as reduction in pain interference with daily functioning, was almost twice as high in the high-catastrophizing group, “suggesting the potential utility of this method for patients at greatest risk,” said Dr. Edwards.
“We see a specific pattern of alterations in chronic pain similar to that seen in anxiety disorder; this suggests that some individuals are primed for the experience of pain,” said Dr. Darnall, clinical professor of anesthesiology, perioperative medicine, and pain medicine at Stanford (Calif.) University. “We are not born with the understanding of how to modulate pain and the distress it causes us.”
When she talks to patients, Dr. Darnall said: “I describe pain as being our ‘harm alarm.’ ... I like to describe it to people that ‘you have a very protective nervous system.’ ”
Dr. Darnall and her colleagues reported success with a pilot study of a single 2.5-hour session that addressed pain catastrophizing. The 57 participants saw a significant decrease in mean scores on the Pain Catastrophizing Scale, from 26.1 at baseline to 13.8 at week 4 (effect size d = 1.15).
On the strength of these early findings, Dr. Darnall and her collaborators are embarking on a randomized controlled trial; the 3-arm comparative effectiveness study will compare a single-session intervention against 8 weeks of CBT or education-only classes for individuals with catastrophizing and chronic pain. The trial is structured to test the hypothesis that the single-session intervention will be noninferior to the full 8 weeks of CBT, Dr. Darnall said.
Building on the importance of avoiding stigmatizing and pejorative terms when talking about pain and catastrophizing, Dr. Darnall said she’s moved away from using the term “catastrophizing” in patient interactions. The one-session intervention is called “Empowered Relief – Train Your Brain Away from Pain.”
There’s a practical promise to a single-session class: Dr. Darnall has taught up to 85 patients at once, she said, adding, “This is a low-cost and scalable intervention.”
Dr. Edwards and Dr. Napadow reported funding from the National Institutes of Health, and they reported no conflicts of interest. Dr. Darnall reported funding from the NIH and the Patient-Centered Outcomes Research Institute. She serves on the scientific advisory board of Axial Healthcare and has several commercial publications about pain.
REPORTING FROM APS 2019
High-dose MTX-based chemo is well tolerated in older PCNSL patients
GLASGOW – Most older patients with primary central nervous system lymphoma (PCNSL) can tolerate high-dose methotrexate-based chemotherapy and achieve similar outcomes as younger and fitter patients, according to a retrospective analysis of 244 patients in the United Kingdom.
For older patients – at least 65 years old – who received methotrexate-based regimens, treatment-related mortality was 6.8%, which is comparable with rates seen in trials involving younger patients, reported lead author Edward Poynton, MD, of University College Hospital in London.
Specifically, Dr. Poynton cited the phase 2 IELSG32 trial, which had a treatment-related mortality rate of 6% among patients up to age 70 years. These patients were treated with the established protocol for younger patients: chemotherapy with methotrexate, cytarabine, thiotepa, and rituximab (MATRix) followed by autologous stem cell transplant or whole-brain radiotherapy.
Introducing Dr. Poynton’s presentation at the annual meeting of the British Society for Haematology, Simon Rule, MD, of the University of Plymouth (England), added historical context to the new findings.
“When I started in hematology ... [PCNSL] was a universally fatal disease, pretty much,” Dr. Rule said. “And then we had methotrexate, and it worked occasionally. And then we had a randomized trial, which was randomization of methotrexate plus or minus high-dose cytarabine, showing benefit.”
This combination became the benchmark against which subsequent randomized trials were measured; however, such high-intensity regimens have raised concerns about safety and efficacy in older patients, Dr. Rule said, noting that the present study serves to inform clinicians about real-world outcomes in this population.
The retrospective analysis reviewed 244 patients who were aged at least 65 years when histologically diagnosed with PCNSL at 14 U.K. tertiary centers between 2012 and 2017. All patients received first-line care of any kind, ranging from best supportive care to clinical trial therapy. Patients were grouped into three treatment cohorts divided by level of frailty. Analysis showed that these divisions correlated with age, renal function, Eastern Cooperative Oncology Group performance status, and treatment intensity.
The frail group received palliative treatment consisting of whole-brain radiotherapy, an oral alkylator, or best supportive care. The less-fit group received methotrexate in combination with rituximab, an oral alkylator, or both. The fit group was most intensively treated, receiving high-dose methotrexate and cytarabine – with or without rituximab – or MATRix.
The primary objective was overall response rate, while the secondary objectives were median overall survival and progression-free survival.
The analysis showed that 79% of patients (n = 193) received methotrexate-based therapy of some kind, with 61% receiving three or more cycles of therapy and 30% requiring dose reductions. The overall response rate was 63%.
Dr. Poynton noted that about two-thirds of patients who achieved a partial response in early assessment went on to achieve a complete response. Patients in the fit group more often responded than those who were less fit (87% vs. 65%; P = .01) and more often received consolidation therapy (42% vs. 23%; P = .01).
Fitness level was also associated with median overall survival, which was longest in the fit group at 42 months. The other two groups had dramatically shorter survival times: 8 months in the less-fit group and just 2 months in the frail group.
A closer look at the data revealed some patterns, Dr. Poynton said.
“What we see is that age at diagnosis is significantly correlated with progression-free survival but not with overall survival,” he said, noting that, in contrast, performance status was associated with both survival measures.
Methotrexate dose also impacted both survival measures. Patients who received 75% or more of their induction dose over the course of treatment had better median overall survival and progression-free survival than those who received less than 75%. Similarly, consolidation therapy improved both survival measures.
Patients aged older than 70 years who received intensive chemotherapy had a treatment-related mortality rate of 4.8%, which is lower than the overall treatment-related mortality, Dr. Poynton reported.
Considering the correlation between methotrexate dose and survival, Dr. Poynton suggested that “dose reductions should be carefully considered.”
He also noted that patients in the fit cohort who received intensive chemotherapy had outcomes comparable with those of younger patients in prospective trials, and yet 44% of real-world patients older than 65 years who received high-dose methotrexate with cytarabine would have been ineligible for the IELSG32 trial.
“We’ve been able to identify this cohort of patients retrospectively,” Dr. Poynton said. “They definitely exist, and I think we need to work harder at how [we] are going to identify these patients prospectively in the future, so we know which of our patients who are older can benefit from intensive chemotherapy and which patients won’t.”
Dr. Poynton reported having no relevant financial disclosures. His coinvestigators reported relationships with AbbVie, Merck, Takeda, Jazz Pharmaceuticals, and others.
REPORTING FROM BSH 2019
CD40 ligand–binding protein safely lowered RA disease activity
A nonantibody scaffold protein that targets the CD40 ligand appears to dampen down autoimmune responses without the thromboembolic complications seen in trials of monoclonal antibodies against the CD40 ligand.
In a paper published in Science Translational Medicine, researchers presented the results of a phase 1a study in 59 healthy volunteers and a phase 1b proof-of-concept study in 57 individuals with rheumatoid arthritis. Participants received either varying dosages of CD40 ligand (CD40L)–binding protein VIB4920 – a single dose in the healthy volunteers and seven doses in the phase 1b study – or placebo.
Jodi L. Karnell, PhD, of Viela Bio in Gaithersburg, Md., and coauthors wrote that the CD40/CD40L pathway is known to play a key role in humoral immune responses and in the pathogenesis of several autoimmune diseases.
However, previous clinical trials of compounds targeting CD40L were stopped early because of an increased risk of adverse thromboembolic events related to platelet aggregation, despite showing potential benefits in lupus and immune thrombocytopenic purpura.
Preclinical studies of VIB4920 found that it blocked the expansion of CD40L-dependent human B cells without showing any signs of platelet aggregation. The authors said the platelet aggregation had been linked to a particular region of anti-CD40L monoclonal antibodies, but VIB4920 was engineered using a protein scaffold that did not contain that region.
In healthy volunteers, researchers saw a dose-dependent suppression of antibody production and a reduction in B-cell proliferation in the recall response to a T cell–dependent antigen.
In individuals with rheumatoid arthritis, more than half of those treated with the two highest dosages of VIB4920 achieved low disease activity state or clinical remission by 12 weeks. Overall, there was also a significant decrease in disease activity, and dose-dependent decreases in rheumatoid factor autoantibodies and Vectra DA biomarker score, which is a composite of 12 rheumatoid arthritis–related biomarkers.
“The consistency of improvement across a variety of clinical and laboratory outcome measures further supports the potential clinical efficacy of VIB4920,” the authors wrote.
Researchers saw a similar rate of adverse events in the placebo and treatment arms of the study.
“VIB4920 represents an alternative to monoclonal antibody–based targeting of CD40L, which does not induce platelet aggregation in vitro and demonstrates a favorable safety profile in early clinical evaluation,” the authors concluded.
The study was funded by MedImmune. All but one author were employees of MedImmune/AstraZeneca or of Viela Bio.
SOURCE: Karnell J et al. Sci Transl Med. 2019 April 24. doi: 10.1126/scitranslmed.aar6584
FROM SCIENCE TRANSLATIONAL MEDICINE
Key clinical point:
Major finding: Treatment with the two highest doses of VIB4920 was associated with low disease activity or remission in more than half of rheumatoid arthritis patients.
Study details: Phase 1a and 1b study in 59 healthy individuals and 57 people with rheumatoid arthritis.
Disclosures: The study was funded by MedImmune. All but one author were employees of MedImmune/AstraZeneca or Viela Bio.
Source: Karnell J et al. Sci Transl Med. 2019 April 24. doi: 10.1126/scitranslmed.aar6584.
Report: Part B funds stable, hospital trust running out
Medicare’s Part B trust fund is well funded and stable enough to pay physicians through the foreseeable future, according to an annual report by the Medicare Board of Trustees.
The Supplemental Medical Insurance (SMI) trust fund, which covers Medicare Part B and D, contained $104 billion in assets at the end of 2018 and is expected to be adequately financed in all years because of continued premium and general revenue income, according to the report, which was released April 22.
However, the Hospital Insurance (HI) trust fund, which funds Medicare Part A, is expected to run out by 2026, the same projection as last year, the trustees reported.
In addition, trustees said that total Medicare costs – including both HI and SMI expenditures – will grow from about 4% of gross domestic product (GDP) in 2018 to about 6% of GDP by 2038 and then increase gradually thereafter to about 6.5% of GDP by 2093.
The faster rate of growth in Medicare spending, compared with GDP growth, is attributable to a growing number of Medicare patients and increased volume and intensity of health care services, according to the report. SMI costs alone are projected to grow steadily from 2% of GDP in 2018 to about 4% of GDP in 2038 because of the aging population and rising health care costs.
The report delivers a dose of reality, reminding the country that the program’s main trust for hospital services can pay full benefits for only 7 more years, said Seema Verma, administrator of the Centers for Medicare & Medicaid Services.
“The Trump administration is working hard to protect and strengthen Medicare and lower costs while improving quality in order to protect the program for future generations of seniors who have paid into the program their whole lives,” Ms. Verma said in a statement. “If we do not take the fiscal crisis in Medicare seriously, we will jeopardize access to health care for millions of seniors.”
Department of Health & Human Services Secretary Alex M. Azar II said the annual report provides a sobering reminder that more work is necessary to support current and future generations of seniors.
“Instead of trying to expand Medicare into a universal entitlement that even covers wealthy Americans of working age, as some have proposed, we need to fulfill Medicare’s promise to our seniors,” Mr. Azar said in a statement, referring to proposals to expand government health care by some Democrats.
The trustees report notes that Medicare has introduced a number of initiatives to strengthen and protect the program and finalized a number of rules that advance a patient-driven health care system through competition.
“In particular, CMS is strengthening Medicare through increasing choice in Medicare Advantage and adding supplemental benefits to the program, offering more care options for people with diabetes, providing new telehealth services, and lowering prescription drug costs for seniors,” the agency stated in a press release. “CMS is also continuing work to advance policies to increase price transparency and help beneficiaries compare costs across different providers.”
Medicare’s Part B trust fund is well funded and stable enough to pay physicians through the foreseeable future, according to an annual report by the Medicare Board of Trustees.
The Supplemental Medical Insurance (SMI) trust fund, which covers Medicare Part B and D, contained $104 billion in assets at the end of 2018 and is expected to be adequately financed in all years because of continued premium and general revenue income, according to the report, which was released April 22.
However, the Hospital Insurance (HI) trust fund, which funds Medicare Part A, is expected to run out by 2026, the same projection as last year, the trustees reported.
In addition, trustees said that total Medicare costs – including both HI and SMI expenditures – will grow from about 4% of gross domestic product (GDP) in 2018 to about 6% of GDP by 2038 and then increase gradually thereafter to about 6.5% of GDP by 2093.
The faster rate of growth in Medicare spending, compared with GDP growth, is attributable to a growing number of Medicare patients and increased volume and intensity of health care services, according to the report. Alone, SMI costs are projected to grow steadily from 2% of GDP in 2018 to about 4% of GDP in 2038 because of the aging population and rising health care costs.
The report delivers a dose of reality, reminding the country that the program’s main trust for hospital services can pay full benefits for only 7 more years, Seema Verma, administrator of the Centers for Medicare & Medicaid Services said.
“The Trump administration is working hard to protect and strengthen Medicare and lower costs while improving quality in order to protect the program for future generations of seniors who have paid into the program their whole lives,” Ms. Verma said in a statement. “If we do not take the fiscal crisis in Medicare seriously, we will jeopardize access to health care for millions of seniors.”
Department of Health & Human Services Secretary Alex M. Azar II said the annual report provides a sobering reminder that more work is necessary to support current and future generations of seniors.
“Instead of trying to expand Medicare into a universal entitlement that even covers wealthy Americans of working age, as some have proposed, we need to fulfill Medicare’s promise to our seniors,” Mr. Azar said in a statement, referring to proposals to expand government health care by some Democrats.
The trustees report notes that Medicare has introduced a number of initiatives to strengthen and protect the program and finalized a number of rules that advance a patient-driven health care system through competition.
“In particular, CMS is strengthening Medicare through increasing choice in Medicare Advantage and adding supplemental benefits to the program, offering more care options for people with diabetes, providing new telehealth services, and lowering prescription drug costs for seniors,” the agency stated in a press release. “CMS is also continuing work to advance policies to increase price transparency and help beneficiaries compare costs across different providers.”
Subcutaneous or IV trastuzumab? Take your pick
It’s a toss-up: For patients with early, HER2-positive breast cancer, subcutaneous trastuzumab is comparable in efficacy and safety with intravenous trastuzumab, final results of the phase 3, randomized HannaH trial indicate.
The 6-year event-free survival (EFS) and overall survival (OS) rates were identical for patients randomized to either subcutaneously or intravenously delivered trastuzumab (Herceptin and biosimilars); adverse event rates were also similar, reported Christian Jackisch, MD, PhD, from Sana Klinikum Offenbach, Germany, and his associates.
“Event-free survival and OS results after 6 years of follow-up continue to support the noninferiority of subcutaneous trastuzumab to intravenous trastuzumab observed in the primary analysis. Results for EFS were consistent with those observed in the Neoadjuvant Herceptin [NOAH] trial of intravenous trastuzumab,” the investigators wrote in JAMA Oncology.
The HannaH (Enhanced Treatment With Neoadjuvant Herceptin) trial was designed to show whether subcutaneous trastuzumab was noninferior to intravenous trastuzumab for patients with HER2 (ERBB2)-positive early breast cancer.
Patients received four cycles of neoadjuvant docetaxel, followed by four cycles of combination chemotherapy with fluorouracil, epirubicin, and cyclophosphamide, plus either subcutaneous trastuzumab 600 mg delivered over 5 minutes or IV trastuzumab at a loading dose of 8 mg/kg and maintenance dose of 6 mg/kg every 3 weeks. Patients received an additional 10 cycles of trastuzumab post surgery.
The coprimary endpoints were pathologic complete response, defined as absence of invasive neoplastic cells in the breast (remaining ductal carcinoma in situ was accepted), and serum trough concentration before dosing at cycle 8.
The primary analysis, published in 2012, showed that the subcutaneous formulation had pharmacokinetic, efficacy, and safety profiles comparable with those of standard intravenous administration. Subsequent analyses showed similar 3-year EFS rates and safety profiles, Dr. Jackisch and colleagues noted.
The current, final analysis was conducted in the intention-to-treat population after a median follow-up of 5.9 years in the subcutaneous group (294 women) and 6.0 years in the intravenous group (297 women).
The 6-year EFS rate was 65% in each group, and the OS rate was 84% in each group. In both trial arms, 6-year EFS and OS rates were higher for patients with complete pathologic responses than for patients with residual disease.
Adverse events of any grade were reported in 97.6% of the subcutaneous group and 94.6% of the intravenous group. Grade 3 or greater adverse events occurred in 53.2% versus 53.7%, cardiac adverse events in 14.8% versus 14.1%, and serious adverse events in 21.9% versus 15.1%, respectively.
The HannaH trial was sponsored by Hoffmann-La Roche. Dr. Jackisch and several coauthors reported receiving grants and personal fees from Hoffmann-La Roche.
SOURCE: Jackisch C et al. JAMA Oncol. 2019 Apr 18. doi: 10.1001/jamaoncol.2019.0339.
FROM JAMA ONCOLOGY
Suicide barriers on the Golden Gate Bridge: Will they save lives?
Ultimately, we need to find better treatments for depression and anxiety
San Francisco entrances people. Photographers capture more images of the Golden Gate Bridge than any other bridge in the world.1 And only the Nanjing Yangtze River Bridge in China surpasses the Golden Gate as a destination for dying by suicide.2 At least 1,700 people reportedly have plunged from the bridge to their deaths since its opening in 1937.3
Despite concerted efforts by bridge security, the local mental health community, and a volunteer organization – Bridgewatch Angels – suicides continue at the pace of about 1 every 2 weeks. After more than 60 years of discussion, transportation officials allocated funding and have started building a suicide prevention barrier system on the Golden Gate.
Extrapolating from the success of barriers built on other bridges that were “suicide magnets,” we should be able to assure people that suicide deaths from the Golden Gate will dramatically decrease, and perhaps cease completely.4 Certainly, some in the mental health community think this barrier will save lives. They support this claim by citing research showing that removing highly accessible and lethal means of suicide reduces overall suicide rates, and that suicidal individuals, when thwarted, do not seek alternate modes of death.
I support building the Golden Gate suicide barrier, partly because symbolically, it should deliver a powerful message that we value all human life. But will the barrier save lives? I don’t think it will. As the American Psychiatric Association prepares to gather for its annual meeting in San Francisco, I would like to share my reasoning.
What the evidence shows
The most robust evidence that restricting availability of highly lethal and accessible means of suicide reduces overall suicide deaths comes from studies looking at self-poisoning in Asian countries and Great Britain. In many parts of Asia, ingestion of pesticides constitutes a significant proportion of suicide deaths, and several studies have found that, in localities where sales of highly lethal pesticides were restricted, overall suicide deaths decreased.5,6 Conversely, suicide rates increased when more lethal varieties of pesticides became more available. In Great Britain, overall suicide rates decreased when natural gas replaced coal gas for home heating and cooking.7 For decades preceding this change, more Britons had killed themselves by inhaling coal gas than by any other method.
Strong correlations exist between regional levels of gun ownership and suicide rates by shooting,8 but several potentially confounding sociopolitical factors explain some portion of this connection. Stronger evidence of gun availability affecting suicide rates has been demonstrated by decreases in suicide rates after restrictions in gun access in Switzerland,9 Israel,10 and other areas. These studies show correlations – not causality. However, the number of studies, links between increases and decreases in suicide rates with changes in access to guns, absence of changes in suicide rates during the same time periods among ostensibly similar control populations, and lack of other compelling explanations support the argument that restricting access to highly lethal and accessible means of suicide prevents suicide deaths overall.
The installation of suicide barriers on bridges that have been the sites of multiple suicides robustly reduces or even eliminates suicide deaths from those bridges,11 but the effect on overall suicide rates remains less clear. Various studies have found subsequent increases or no changes12-14 in suicide deaths from other bridges or tall buildings in the vicinity after the installation of suicide barriers on a “suicide magnet.” Many of the studies failed to find any impact on overall suicide rates in the regions investigated. Deaths from jumping off tall structures constitute a tiny proportion of total suicide deaths, making it difficult to detect any changes in overall suicide rates. In the United States, suicides by jumping/falling constituted 1%-2% of total suicides over the last several decades.15
If we know that restricting highly lethal and accessible methods of killing reduces suicide deaths, why would I question the value of the Golden Gate suicide barrier in preventing overall suicide deaths?
Unique aspects of the bridge
The World High Dive Federation recommends keeping dives to less than 20 meters (65.6 feet), with a few exceptions.16 The rail of the Golden Gate Bridge stands 67 meters (220 feet) above the water, and assuming minimal wind resistance, a falling person traverses that distance in about 3.7 seconds and lands with an impact of 130 km/hour (81 miles per hour).17 Only about 1%-2% of those jumping from the Golden Gate survive that fall.18
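The fall time and impact speed cited above follow from basic free-fall kinematics; this minimal sketch reproduces them under the same no-air-resistance assumption, using the rail height given in the text.

```python
import math

# Free fall from the Golden Gate Bridge rail, ignoring air resistance,
# as assumed in the text above.
g = 9.81          # gravitational acceleration, m/s^2
height_m = 67.0   # rail height above the water, meters

fall_time_s = math.sqrt(2 * height_m / g)   # t = sqrt(2h / g)
impact_mps = g * fall_time_s                # v = g * t
impact_kmh = impact_mps * 3.6
impact_mph = impact_kmh / 1.609344

print(f"fall time: {fall_time_s:.1f} s")    # ~3.7 s
print(f"impact: {impact_kmh:.0f} km/h ({impact_mph:.0f} mph)")
```

Run as written, this yields roughly 3.7 seconds and 130 km/h (81 mph), matching the figures in the text.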
A 99% likelihood of death sounds pretty lethal; however, death by jumping from the Golden Gate inherently takes place in a public space, with the opportunity for interventions by other people. A more realistic calculation of the lethality would start the instant that someone initiates a sequence of behaviors leading to the intended death. By that criterion, measuring the lethality of the Golden Gate would begin when an individual enters a vehicle or sets off on foot with the plan of going over the railing.
Unless our surveillance-oriented society makes substantially greater advances (which I oppose), we will remain unable to assess suicide lethality by starting at the moment of inception. However, we do have data showing what happens once someone with suicidal intentions walks onto the bridge.
Between 2000 and 2018, observers noted 2,546 people on the Golden Gate who appeared to be considering a suicide attempt, the San Francisco Chronicle has reported. Five hundred sixty-four confirmed suicides occurred. In an additional 71 cases, suicide is presumed but bodies were not recovered. In the 1,911 remaining instances, mental health interventions were made, with individuals taken to local hospitals and psychiatric wards, and released when no longer overtly suicidal. Interventions successfully diverted 75% (1,911/2,546) of those intending to end their own lives, which suggests that the current lethality of the Golden Gate as a means of suicide is only 25%. Even in the bridge’s first half-century, without constant camera monitoring, and a cadre of volunteers and professionals scanning for those attempting suicide, the lethality rate approached about 50%.19
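The 25% lethality figure can be checked directly from the counts cited above; this short sketch uses the Chronicle's numbers exactly as given in the text.

```python
# Intervention and lethality arithmetic for the Golden Gate, 2000-2018,
# using the counts cited from the San Francisco Chronicle.
observed = 2546           # people seen apparently considering an attempt
confirmed_deaths = 564
presumed_deaths = 71      # bodies not recovered

diverted = observed - confirmed_deaths - presumed_deaths
diversion_rate = diverted / observed
lethality = 1 - diversion_rate

print(diverted)                 # 1911
print(f"{diversion_rate:.0%}")  # 75%
print(f"{lethality:.0%}")       # 25%
```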
We face even more difficulties measuring accessibility than in determining lethality. The Golden Gate appears to be accessible to almost anyone – drivers have to pay a toll only when traveling from the north, and then only after they have traversed the span. Pedestrians retain unfettered admittance to the east sidewalk (facing San Francisco city and bay) throughout daylight hours. But any determination of accessibility must include how quickly and easily one can make use of an opportunity.
Both entrances to the Golden Gate are embedded in the Golden Gate National Recreational Area, part of the National Park system. The south entrance to the bridge arises from The Presidio, a former military installation that housed about 4,000 people.20 Even fewer people live in the parklands at the north end of the bridge. The Presidio extends far enough so that the closest San Francisco neighborhoods outside of the park are a full 2.2 km (1.36 miles) from the bridge railing. A brisk walk would still require a minimum of about 20 minutes to get to the bridge; it is difficult to arrive at the bridge without a trek.
Researchers define impulsivity, like accessibility, inconsistently – and often imprecisely. Impulsivity, which clearly exists on a spectrum, connotes overvaluing of immediate feelings and thoughts at the expense of longer term goals and aspirations. Some suicide research appears to define impulsivity as the antithesis of planned behavior;21,22 others define it pragmatically as behaviors executed within 5 minutes of a decision,23 and still others contend that “suicidal behavior is rarely if ever impulsive.”24 Furthermore, when we assess impulsivity, we must acknowledge a fundamental difference between “impulsive” shootings and poisonings that are accomplished at home and within seconds or minutes, from “impulsive” Golden Gate Bridge suicide attempts, which require substantial travel and time commitments, and inherently involve the potential for others to intervene.
Those arguing that the bridge suicide barrier will save lives often bring up two additional sets of numbers to back up their assertions. They provide evidence that most of those people who were stopped in their attempts at suicide at the Golden Gate do not go on to commit suicide elsewhere, and that many of those who survived their attempts express regret at having tried to kill themselves. Specifically, 94% of those who were prevented from jumping from the Golden Gate had not committed suicide after a median follow-up of 26 years, according to a follow-up study published a few years ago. On the other hand, those who have made a serious suicide attempt have a substantially increased risk, relative to the general population, of dying from a later attempt,25,26 and the strongest predictor for death by suicide is having made a previous, serious suicide attempt.27
While all of these studies provide important and interesting information regarding suicide, none directly address the question of whether individuals will substitute attempts by other methods if the Golden Gate Bridge were no longer available. Many discussions blur the distinction between how individuals behave after a thwarted Golden Gate suicide attempt and how other people might act if we secured the bridge from any potential future suicide attempts. I hope that the following analogy makes this distinction clearer without trivializing: Imagine that we know that everyone who was interrupted while eating their dinner in a particular restaurant never went back and ate out anywhere, ever again. We could not conclude from this that another individual, who learned that the intended restaurant was indefinitely closed, would never dine out again. Once effective suicide barriers exist on the Golden Gate, this will likely become widely known, thereby greatly reducing the likelihood that any individuals will consider the possibility of jumping from the bridge. But it seems very unlikely that this would vanquish all suicidal impulses from the northern California population.
Lessons from patients
Two former patients of mine ended their lives by suicide from the Golden Gate. P, a solitary and lonely man in his 50s, was referred to me by his neighbor, Q, one of my long-term patients. P had a history of repeated assessments for lifelong depression, with minimal follow-up. I made a treatment plan with P that we hoped would address both his depression and his reluctance to engage with mental health professionals. He did not return for his follow-up appointment and ignored all my attempts to contact him.
P continued to have intermittent contact with Q. A decade after I had evaluated him, P was finally hospitalized for depression. Since P had no local family or friends, he asked Q to pick him up from the hospital at the time of his discharge. P asked Q to drive him to the Golden Gate Bridge, ostensibly to relish his release by partaking of the panoramic view of San Francisco from the bridge. They parked in the lot at the north end of the bridge, where Q stayed with the car at the vista point. The last that anyone saw of P was when Q noticed him walking on the bridge; nobody saw him go over and his body was not recovered.
In contrast to my brief connection with P, I worked with S over the course of 8 years to deal with her very severe attention-deficit/hyperactivity disorder and associated depression, which destroyed jobs and friendships, and estranged her from her family. She moved to Hawaii in hopes of “starting over with less baggage,” but I received a few phone calls over the next few years detailing suicide attempts, including driving her car off a bridge. Floundering in life, she returned to San Francisco and was hospitalized with suicidal ideation. The inpatient team sedated her heavily, ignored her past treatments and diagnoses, and discharged her after several days. Within a day of discharge, S’s sister called to say that S’s body had been recovered from the water below the bridge.
I don’t think that suicide was inevitable for either P or S, but I also lack any indication that either would be alive today had we installed suicide barriers on the Golden Gate years ago. Unless we eliminate access to guns, cars, trains, poisons, ropes, tall buildings and cliffs, people contemplating suicide will have numerous options at their disposal. We are likely to save lives by continuing to find ways to restrict access to means of death that can be used within seconds and have a high degree of lethality, and we should persist with such efforts. Buying a $5 trigger lock for every gun in California, and spending tens of millions on a public service campaign would cost less and may well save more lives than the Golden Gate suicide barrier. Unfortunately, we still possess very limited knowledge regarding which suicide prevention measures have an “impact on actual deaths or behavior.”28
To increase our efficacy in reducing suicide, we need to find better treatments for depression and anxiety. We also need to identify better ways of targeting those most at risk for suicide,29 improve our delivery of such treatments, and mitigate the social factors that contribute to such misery and unhappiness.
As a psychiatrist who has lost not only patients but also family members to suicide, I appreciate the hole in the soul these deaths create. I understand the drive to find ways to prevent additional deaths and save future survivors from such grief. But we must design psychiatric interventions that do the maximum good. To be imprecise in the lessons we learn from those who have killed themselves doubles down on the disservice to those lives already lost.
Dr. Kruse is a psychiatrist who practices in San Francisco. Several key details about the patients were changed to protect confidentiality.
References
1. Frommer’s Comprehensive Travel Guide, California. New York: Prentice Hall Travel, 1993.
2. “Chen Si, the ‘Angel of Nanjing,’ has saved more than 330 people from suicide,” by Matt Young, News.com.au. May 14, 2017.
3. “Finding Kyle,” by Lizzie Johnson, San Francisco Chronicle. Feb 8, 2019.
4. Beautrais A. Suicide by jumping: A review of research and prevention strategies. Crisis. 2007;28 Suppl 1:58-63.
5. Gunnell D et al. The global distribution of fatal pesticide self-poisoning: Systematic review. BMC Public Health. 2007 Dec. 21;7:357.
6. Vijayakumar L and Satheesh-Babu R. Does ‘no pesticide’ reduce suicides? Int J Soc Psychiatry. 2009 Jul 17;55:401-6.
7. Kreitman N. The coal gas story. United Kingdom suicide rates, 1960-71. Br J Prev Soc Med. 1976 Jun;30(2):86-93.
8. Ajdacic-Gross V et al. Changing times: A longitudinal analysis of international firearm suicide data. Am J Public Health. 2006 Oct;96(10):1752-5.
9. Reisch T et al. Change in suicide rates in Switzerland before and after firearm restriction resulting from the 2003 “Army XXI” reform. Am J Psychiatry. 2013 Sep;170(9):977-84.
10. Lubin G et al. Decrease in suicide rates after a change of policy reducing access to firearms in adolescents: A naturalistic epidemiological study. Suicide Life Threat Behav. 2010 Oct;40(5):421-4.
11. Sinyor M and Levitt A. Effect of a barrier at Bloor Street Viaduct on suicide rates in Toronto: Natural experiment. BMJ. 2010;341:c2884. doi: 10.1136/bmj.c2884.
12. O’Carroll P and Silverman M. Community suicide prevention: The effectiveness of bridge barriers. Suicide Life Threat Behav. 1994 Spring;24(1):89-91; discussion 91-9.
13. Pelletier A. Preventing suicide by jumping: The effect of a bridge safety fence. Inj Prev. 2007 Feb;13(1):57-9.
14. Bennewith O et al. Effect of barriers on the Clifton suspension bridge, England, on local patterns of suicide: Implications for prevention. Br J Psychiatry. 2007 Mar;190:266-7.
15. Harvard T.H. Chan School of Public Health. 2004. “How do people most commonly complete suicide?”
16. “How cliff diving works,” by Heather Kolich, HowStuffWorks.com. Oct 5, 2009.
17. “Bridge design and construction statistics.” Goldengate.org
18. “How did teen survive fall from Golden Gate Bridge?” by Remy Molina, Live Science. Apr 19, 2011.
19. Seiden R. Where are they now? A follow-up study of suicide attempters from the Golden Gate Bridge. Suicide Life Threat Behav. 1978 Winter;8(4):203-16.
20. Presidio demographics. Point2homes.com.
21. Baca-García E et al. A prospective study of the paradoxical relationship between impulsivity and lethality of suicide attempts. J Clin Psychiatry. 2001 Jul;62(7):560-4.
22. Lim M et al. Differences between impulsive and non-impulsive suicide attempts among individuals treated in emergency rooms of South Korea. Psychiatry Investig. 2016 Jul;13(4):389-96.
23. Simon O et al. Characteristics of impulsive suicide attempts and attempters. Suicide Life Threat Behav. 2001;32(1 Suppl):49-59.
24. Anestis M et al. Reconsidering the link between impulsivity and suicidal behavior. Pers Soc Psychol Rev. 2014 Nov;18(4):366-86.
25. Ostamo A et al. Excess mortality of suicide attempters. Soc Psychiatry Psychiatr Epidemiol. 2001 Jan;36(1):29-35.
26. Leon A et al. Statistical issues in the identification of risk factors for suicidal behavior: The application of survival analysis. Psychiatry Res. 1990 Jan;31(1):99-108.
27. Bostwick J et al. Suicide attempt as a risk factor for completed suicide: Even more lethal than we knew. Am J Psychiatry. 2016 Nov 1;173(11):1094-100.
28. Stone D and Crosby A. Suicide prevention. Am J Lifestyle Med. 2014;8(6):404-20.
29. Belsher B et al. Prediction models for suicide attempts and deaths: A systematic review and simulation. JAMA Psychiatry. 2019 Mar 13. doi: 10.1001/jamapsychiatry.2019.0174.
San Francisco entrances people. Photographers capture more images of the Golden Gate Bridge than any other bridge in the world.1 And only the Nanjing Yangtze River Bridge in China surpasses the Golden Gate as a destination for dying by suicide.2 At least 1,700 people reportedly have plunged from the bridge to their deaths since its opening in 1937.3
Despite concerted efforts by bridge security, the local mental health community, and a volunteer organization – Bridgewatch Angels – suicides continue at the pace of about 1 every 2 weeks. After more than 60 years of discussion, transportation officials allocated funding and have started building a suicide prevention barrier system on the Golden Gate.
Extrapolating from the success of barriers built on other bridges that were “suicide magnets,” we should be able to assure people that suicide deaths from the Golden Gate will dramatically decrease, and perhaps cease completely.4 Certainly, some in the mental health community think this barrier will save lives. They support this claim by citing research showing that removing highly accessible and lethal means of suicide reduces overall suicide rates, and that suicidal individuals, when thwarted, do not seek alternate modes of death.
I support building the Golden Gate suicide barrier, partly because symbolically, it should deliver a powerful message that we value all human life. But will the barrier save lives? I don’t think it will. As the American Psychiatric Association prepares to gather for its annual meeting in San Francisco, I would like to share my reasoning.
What the evidence shows
The most robust evidence that restricting availability of highly lethal and accessible means of suicide reduces overall suicide deaths comes from studies looking at self-poisoning in Asian countries and Great Britain. In many parts of Asia, ingestion of pesticides constitutes a significant proportion of suicide deaths, and several studies have found that, in localities where sales of highly lethal pesticides were restricted, overall suicide deaths decreased.5,6 Conversely, suicide rates increased when more lethal varieties of pesticides became more available. In Great Britain, overall suicide rates decreased when natural gas replaced coal gas for home heating and cooking.7 For decades preceding this change, more Britons had killed themselves by inhaling coal gas than by any other method.
Strong correlations exist between regional levels of gun ownership and suicide rates by shooting,8 but several potentially confounding sociopolitical factors explain some portion of this connection. Stronger evidence that gun availability affects suicide rates comes from decreases in suicide rates after restrictions on gun access in Switzerland,9 Israel,10 and other areas. These studies show correlations – not causality. However, the sheer number of studies, the consistency with which suicide rates rise and fall alongside changes in access to guns, the absence of such changes over the same periods among ostensibly similar control populations, and the lack of other compelling explanations together support the argument that restricting access to highly lethal and accessible means of suicide prevents suicide deaths overall.
The installation of suicide barriers on bridges that have been the sites of multiple suicides robustly reduces or even eliminates suicide deaths from those bridges,11 but the effect on overall suicide rates remains less clear. Various studies have found subsequent increases or no changes12-14 in suicide deaths from other bridges or tall buildings in the vicinity after the installation of suicide barriers on a “suicide magnet.” Many of the studies failed to find any impact on overall suicide rates in the regions investigated. Deaths from jumping off tall structures constitute a tiny proportion of total suicide deaths, making it difficult to detect any changes in overall suicide rates. In the United States, suicides by jumping/falling constituted 1%-2% of total suicides over the last several decades.15
If we know that restricting highly lethal and accessible methods of killing reduces suicide deaths, why would I question the value of the Golden Gate suicide barrier in preventing overall suicide deaths?
Unique aspects of the bridge
The World High Dive Federation recommends keeping dives to less than 20 meters (65.5 feet), with a few exceptions.16 The rail of the Golden Gate Bridge stands 67 meters (220 feet) above the water, and assuming minimal wind resistance, a falling person traverses that distance in about 3.7 seconds and lands with an impact of 130 km/hour (81 miles per hour).17 Only about 1%-2% of those jumping from the Golden Gate survive that fall.18
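The figures above follow directly from free-fall kinematics. A minimal sketch of the calculation (my own illustration, assuming no air resistance, which slightly overstates the impact speed):

```python
import math

# Free fall from the Golden Gate Bridge rail, ignoring air resistance.
g = 9.81          # gravitational acceleration, m/s^2
height_m = 67.0   # rail height above the water, meters

fall_time_s = math.sqrt(2 * height_m / g)   # t = sqrt(2h/g), about 3.7 s
impact_ms = g * fall_time_s                 # v = g*t, in m/s
impact_kmh = impact_ms * 3.6                # about 130 km/h
impact_mph = impact_kmh / 1.609344          # about 81 mph

print(f"fall time: {fall_time_s:.1f} s")
print(f"impact speed: {impact_kmh:.0f} km/h ({impact_mph:.0f} mph)")
```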
A 99% likelihood of death sounds pretty lethal; however, death by jumping from the Golden Gate inherently takes place in a public space, with the opportunity for interventions by other people. A more realistic calculation of the lethality would start the instant that someone initiates a sequence of behaviors leading to the intended death. By that criterion, measuring the lethality of the Golden Gate would begin when an individual enters a vehicle or sets off on foot with the plan of going over the railing.
Unless our surveillance-oriented society makes substantially greater advances (which I oppose), we will remain unable to assess suicide lethality by starting at the moment of inception. However, we do have data showing what happens once someone with suicidal intentions walks onto the bridge.
Between 2000 and 2018, observers noted 2,546 people on the Golden Gate who appeared to be considering a suicide attempt, the San Francisco Chronicle has reported. Five hundred sixty-four confirmed suicides occurred. In an additional 71 cases, suicide is presumed but bodies were not recovered. In the 1,911 remaining instances, mental health interventions were made, with individuals taken to local hospitals and psychiatric wards, and released when no longer overtly suicidal. Interventions successfully diverted 75% (1,911/2,546) of those intending to end their own lives, which suggests that the current lethality of the Golden Gate as a means of suicide is only 25%. Even in the bridge’s first half-century, without constant camera monitoring, and a cadre of volunteers and professionals scanning for those attempting suicide, the lethality rate approached about 50%.19
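The 75% figure is simple arithmetic on the Chronicle's counts. A quick check (the variable names are mine):

```python
# 2000-2018 Golden Gate figures, as cited from the San Francisco Chronicle.
sightings = 2546          # people observed apparently considering an attempt
confirmed_deaths = 564
presumed_deaths = 71      # suicide presumed; bodies not recovered
diverted = sightings - confirmed_deaths - presumed_deaths   # 1,911 interventions

diversion_rate = diverted / sightings                       # about 75%
lethality = (confirmed_deaths + presumed_deaths) / sightings  # about 25%

print(f"diverted: {diverted}")
print(f"diversion rate: {diversion_rate:.0%}")
print(f"lethality: {lethality:.0%}")
```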
We face even more difficulties measuring accessibility than in determining lethality. The Golden Gate appears to be accessible to almost anyone – drivers have to pay a toll only when traveling from the north, and then only after they have traversed the span. Pedestrians retain unfettered admittance to the east sidewalk (facing San Francisco city and bay) throughout daylight hours. But any determination of accessibility must include how quickly and easily one can make use of an opportunity.
Both entrances to the Golden Gate are embedded in the Golden Gate National Recreation Area, part of the National Park system. The south entrance to the bridge arises from The Presidio, a former military installation that housed about 4,000 people.20 Even fewer people live in the parklands at the north end of the bridge. The Presidio extends far enough so that the closest San Francisco neighborhoods outside of the park are a full 2.2 km (1.36 miles) from the bridge railing. A brisk walk would still require a minimum of about 20 minutes to get to the bridge; it is difficult to arrive at the bridge without a trek.
Researchers define impulsivity, like accessibility, inconsistently – and often imprecisely. Impulsivity, which clearly exists on a spectrum, connotes overvaluing of immediate feelings and thoughts at the expense of longer-term goals and aspirations. Some suicide research appears to define impulsivity as the antithesis of planned behavior;21,22 others define it pragmatically as behaviors executed within 5 minutes of a decision,23 and still others contend that “suicidal behavior is rarely if ever impulsive.”24 Furthermore, when we assess impulsivity, we must acknowledge a fundamental difference between “impulsive” shootings and poisonings, which are accomplished at home and within seconds or minutes, and “impulsive” Golden Gate Bridge suicide attempts, which require substantial travel and time commitments and inherently involve the potential for others to intervene.
Those arguing that the bridge suicide barrier will save lives often bring up two additional sets of numbers to back up their assertions. They provide evidence that most of those people who were stopped in their attempts at suicide at the Golden Gate do not go on to commit suicide elsewhere, and that many of those who survived their attempts express regret at having tried to kill themselves. Specifically, 94% of those who were prevented from jumping from the Golden Gate had not committed suicide after a median follow-up of 26 years, according to a follow-up study published a few years ago. On the other hand, those who have made a serious suicide attempt have a substantially increased risk, relative to the general population, of dying from a later attempt,25,26 and the strongest predictor for death by suicide is having made a previous, serious suicide attempt.27
While all of these studies provide important and interesting information regarding suicide, none directly address the question of whether individuals will substitute attempts by other methods if the Golden Gate Bridge were no longer available. Many discussions blur the distinction between how individuals behave after a thwarted Golden Gate suicide attempt and how other people might act if we secured the bridge from any potential future suicide attempts. I hope that the following analogy makes this distinction clearer without trivializing: Imagine that we know that everyone who was interrupted while eating their dinner in a particular restaurant never went back and ate out anywhere, ever again. We could not conclude from this that another individual, who learned that the intended restaurant was indefinitely closed, would never dine out again. Once effective suicide barriers exist on the Golden Gate, this will likely become widely known, thereby greatly reducing the likelihood that any individuals will consider the possibility of jumping from the bridge. But it seems very unlikely that this would vanquish all suicidal impulses from the northern California population.
Lessons from patients
Two former patients of mine ended their lives by suicide from the Golden Gate. P, a solitary and lonely man in his 50s, was referred to me by his neighbor, Q, one of my long-term patients. P had a history of repeated assessments for lifelong depression, with minimal follow-up. I made a treatment plan with P that we hoped would address both his depression and his reluctance to engage with mental health professionals. He did not return for his follow-up appointment and ignored all my attempts to contact him.
P continued to have intermittent contact with Q. A decade after I had evaluated him, P was finally hospitalized for depression. Since P had no local family or friends, he asked Q to pick him up from the hospital at the time of his discharge. P asked Q to drive him to the Golden Gate Bridge, ostensibly to relish his release by partaking of the panoramic view of San Francisco from the bridge. They parked in the lot at the north end of the bridge, where Q stayed with the car at the vista point. The last that anyone saw of P was when Q noticed him walking on the bridge; nobody saw him go over and his body was not recovered.
In contrast to my brief connection with P, I worked with S over the course of 8 years to deal with her very severe attention-deficit/hyperactivity disorder and associated depression, which destroyed jobs and friendships, and estranged her from her family. She moved to Hawaii in hopes of “starting over with less baggage,” but I received a few phone calls over the next few years detailing suicide attempts, including driving her car off a bridge. Floundering in life, she returned to San Francisco and was hospitalized with suicidal ideation. The inpatient team sedated her heavily, ignored her past treatments and diagnoses, and discharged her after several days. Within a day of discharge, S’s sister called to say that S’s body had been recovered from the water below the bridge.
I don’t think that suicide was inevitable for either P or S, but I also lack any indication that either would be alive today had we installed suicide barriers on the Golden Gate years ago. Unless we eliminate access to guns, cars, trains, poisons, ropes, tall buildings and cliffs, people contemplating suicide will have numerous options at their disposal. We are likely to save lives by continuing to find ways to restrict access to means of death that can be used within seconds and have a high degree of lethality, and we should persist with such efforts. Buying a $5 trigger lock for every gun in California, and spending tens of millions on a public service campaign would cost less and may well save more lives than the Golden Gate suicide barrier. Unfortunately, we still possess very limited knowledge regarding which suicide prevention measures have an “impact on actual deaths or behavior.”28
To increase our efficacy in reducing suicide, we need to find better treatments for depression and anxiety. We also need to identify better ways of targeting those most at risk for suicide,29 improve our delivery of such treatments, and mitigate the social factors that contribute to such misery and unhappiness.
As a psychiatrist who has lost not only patients but also family members to suicide, I appreciate the hole in the soul these deaths create. I understand the drive to find ways to prevent additional deaths and save future survivors from such grief. But we must design psychiatric interventions that do the maximum good. To be imprecise in the lessons we learn from those who have killed themselves doubles down on the disservice to those lives already lost.
Dr. Kruse is a psychiatrist who practices in San Francisco. Several key details about the patients were changed to protect confidentiality.
References
1. Frommer’s Comprehensive Travel Guide, California. New York: Prentice Hall Travel, 1993.
2. “Chen Si, the ‘Angel of Nanjing,’ has saved more than 330 people from suicide,” by Matt Young, News.com.au. May 14, 2017.
3. “Finding Kyle,” by Lizzie Johnson, San Francisco Chronicle. Feb 8, 2019.
4. Beautrais A. Suicide by jumping: A review of research and prevention strategies. Crisis. 2007;28 Suppl 1:58-63.
5. Gunnell D et al. The global distribution of fatal pesticide self-poisoning: Systematic review. BMC Public Health. 2007 Dec. 21;7:357.
6. Vijayakumar L and Satheesh-Babu R. Does ‘no pesticide’ reduce suicides? Int J Soc Psychiatry. 2009 Jul 17;55:401-6.
7. Kreitman N. The coal gas story. United Kingdom suicide rates, 1960-71. Br J Prev Soc Med. 1976 Jun;30(2):86-93.
8. Ajdacic-Gross V et al. Changing times: A longitudinal analysis of international firearm suicide data. Am J Public Health. 2006 Oct;96(10):1752-5.
9. Reisch T et al. Change in suicide rates in Switzerland before and after firearm restriction resulting from the 2003 “Army XXI” reform. Am J Psychiatry. 2013 Sep;170(9):977-84.
10. Lubin G et al. Decrease in suicide rates after a change of policy reducing access to firearms in adolescents: A naturalistic epidemiological study. Suicide Life Threat Behav. 2010 Oct;40(5):421-4.
11. Sinyor M and Levitt A. Effect of a barrier at Bloor Street Viaduct on suicide rates in Toronto: Natural experiment. BMJ. 2010;341:c2884. doi: 10.1136/bmj.c2884.
12. O’Carroll P and Silverman M. Community suicide prevention: The effectiveness of bridge barriers. Suicide Life Threat Behav. 1994 Spring;24(1):89-91; discussion 91-9.
13. Pelletier A. Preventing suicide by jumping: The effect of a bridge safety fence. Inj Prev. 2007 Feb;13(1):57-9.
14. Bennewith O et al. Effect of barriers on the Clifton suspension bridge, England, on local patterns of suicide: Implications for prevention. Br J Psychiatry. 2007 Mar;190:266-7.
15. Harvard T.H. Chan School of Public Health. 2004. “How do people most commonly complete suicide?”
16. “How cliff diving works,” by Heather Kolich, HowStuffWorks.com. Oct 5, 2009.
17. “Bridge design and construction statistics.” Goldengate.org
18. “How did teen survive fall from Golden Gate Bridge?” by Remy Molina, Live Science. Apr 19, 2011.
19. Seiden R. Where are they now? A follow-up study of suicide attempters from the Golden Gate Bridge. Suicide Life Threat Behav. 1978 Winter;8(4):203-16.
20. Presidio demographics. Point2homes.com.
21. Baca-García E et al. A prospective study of the paradoxical relationship between impulsivity and lethality of suicide attempts. J Clin Psychiatry. 2001 Jul;62(7):560-4.
22. Lim M et al. Differences between impulsive and non-impulsive suicide attempts among individuals treated in emergency rooms of South Korea. Psychiatry Investig. 2016 Jul;13(4):389-96.
23. Simon O et al. Characteristics of impulsive suicide attempts and attempters. Suicide Life Threat Behav. 2001;32(1 Suppl):49-59.
24. Anestis M et al. Reconsidering the link between impulsivity and suicidal behavior. Pers Soc Psychol Rev. 2014 Nov;18(4):366-86.
25. Ostamo A et al. Excess mortality of suicide attempters. Soc Psychiatry Psychiatr Epidemiol. 2001 Jan;36(1):29-35.
26. Leon A et al. Statistical issues in the identification of risk factors for suicidal behavior: The application of survival analysis. Psychiatry Res. 1990 Jan;31(1):99-108.
27. Bostwick J et al. Suicide attempt as a risk factor for completed suicide: Even more lethal than we knew. Am J Psychiatry. 2016 Nov 1;173(11):1094-100.
28. Stone D and Crosby A. Suicide prevention. Am J Lifestyle Med. 2014;8(6):404-20.
29. Belsher B et al. Prediction models for suicide attempts and deaths: A systematic review and simulation. JAMA Psychiatry. 2019 Mar 13. doi: 10.1001/jamapsychiatry.2019.0174.
San Francisco entrances people. Photographers capture more images of the Golden Gate Bridge than any other bridge in the world.1 And only the Nanjing Yangtze River Bridge in China surpasses the Golden Gate as a destination for dying by suicide.2 At least 1,700 people reportedly have plunged from the bridge to their deaths since its opening in 1937.3
Despite concerted efforts by bridge security, the local mental health community, and a volunteer organization – Bridgewatch Angels – suicides continue at the pace of about 1 every 2 weeks. After more than 60 years of discussion, transportation officials allocated funding and have started building a suicide prevention barrier system on the Golden Gate.
Extrapolating from the success of barriers built on other bridges that were “suicide magnets,” we should be able to assure people that suicide deaths from the Golden Gate will dramatically decrease, and perhaps cease completely.4 Certainly, some in the mental health community think this barrier will save lives. They support this claim by citing research showing that removing highly accessible and lethal means of suicide reduces overall suicide rates, and that suicidal individuals, when thwarted, do not seek alternate modes of death.
I support building the Golden Gate suicide barrier, partly because symbolically, it should deliver a powerful message that we value all human life. But will the barrier save lives? I don’t think it will. As the American Psychiatric Association prepares to gather for its annual meeting in San Francisco, I would like to share my reasoning.
What the evidence shows
The most robust evidence that restricting availability of highly lethal and accessible means of suicide reduces overall suicide deaths comes from studies looking at self-poisoning in Asian countries and Great Britain. In many parts of Asia, ingestion of pesticides constitutes a significant proportion of suicide deaths, and several studies have found that, in localities where sales of highly lethal pesticides were restricted, overall suicide deaths decreased.5,6 Conversely, suicide rates increased when more lethal varieties of pesticides became more available. In Great Britain, overall suicide rates decreased when natural gas replaced coal gas for home heating and cooking.7 For decades preceding this change, more Britons had killed themselves by inhaling coal gas than by any other method.
Strong correlations exist between regional levels of gun ownership and suicide rates by shooting,8 but several potentially confounding sociopolitical factors explain some portion of this connection. Stronger evidence of gun availability affecting suicide rates has been demonstrated by decreases in suicide rates after restrictions in gun access in Switzerland,9 Israel,10 and other areas. These studies show correlations – not causality. However, the number of studies, links between increases and decreases in suicide rates with changes in access to guns, absence of changes in suicide rates during the same time periods among ostensibly similar control populations, and lack of other compelling explanations support the argument that restricting access to highly lethal and accessible means of suicide prevents suicide deaths overall.
The installation of suicide barriers on bridges that have been the sites of multiple suicides robustly reduces or even eliminates suicide deaths from those bridges,11 but the effect on overall suicide rates remains less clear. Various studies have found subsequent increases or no changes12-14 in suicide deaths from other bridges or tall buildings in the vicinity after the installation of suicide barriers on a “suicide magnet.” Many of the studies failed to find any impact on overall suicide rates in the regions investigated. Deaths from jumping off tall structures constitute a tiny proportion of total suicide deaths, making it difficult to detect any changes in overall suicide rates. In the United States, suicides by jumping/falling constituted 1%-2% of total suicides over the last several decades.15
If we know that restricting highly lethal and accessible methods of killing reduces suicide deaths, why would I question the value of the Golden Gate suicide barrier in preventing overall suicide deaths?
Unique aspects of the bridge
The World High Dive Federation recommends keeping dives to less than 20 meters (65.5 feet), with a few exceptions.16 The rail of the Golden Gate Bridge stands 67 meters (220 feet) above the water, and assuming minimal wind resistance, a falling person traverses that distance in about 3.7 seconds and lands with an impact of 130 km/hour (81 miles per hour).17 Only about 1%-2% of those jumping from the Golden Gate survive that fall.18
A 99% likelihood of death sounds pretty lethal; however, death by jumping from the Golden Gate inherently takes place in a public space, with the opportunity for interventions by other people. A more realistic calculation of the lethality would start the instant that someone initiates a sequence of behaviors leading to the intended death. By that criteria, measuring the lethality of the Golden Gate would begin when an individual enters a vehicle or sets off on foot with the plan of going over the railing.
Unless our surveillance-oriented society makes substantially greater advances (which I oppose), we will remain unable to assess suicide lethality by starting at the moment of inception. However, we do have data showing what happens once someone with suicidal intentions walks onto the bridge.
Between 2000 and 2018, observers noted 2,546 people on the Golden Gate who appeared to be considering a suicide attempt, the San Francisco Chronicle has reported. Five hundred sixty-four confirmed suicides occurred. In an additional 71 cases, suicide is presumed but bodies were not recovered. In the 1,911 remaining instances, mental health interventions were made, with individuals taken to local hospitals and psychiatric wards, and released when no longer overtly suicidal. Interventions successfully diverted 75% (1,911/2,546) of those intending to end their own lives, which suggests that the current lethality of the Golden Gate as a means of suicide is only 25%. Even in the bridge’s first half-century, without constant camera monitoring, and a cadre of volunteers and professionals scanning for those attempting suicide, the lethality rate approached about 50%.19
We face even more difficulties measuring accessibility than in determining lethality. The Golden Gate appears to be accessible to almost anyone – drivers have to pay a toll only when traveling from the north, and then only after they have traversed the span. Pedestrians retain unfettered admittance to the east sidewalk (facing San Francisco city and bay) throughout daylight hours. But any determination of accessibility must include how quickly and easily one can make use of an opportunity.
Both entrances to the Golden Gate are embedded in the Golden Gate National Recreational Area, part of the National Park system. The south entrance to the bridge arises from The Presidio, a former military installation that housed about 4,000 people.20 Even fewer people live in the parklands at the north end of the bridge. The Presidio extends far enough so that the closest San Francisco neighborhoods outside of the park are a full 2.2 km (1.36 miles) from the bridge railing. A brisk walk would still require a minimum of about 20 minutes to get to the bridge; it is difficult to arrive at the bridge without a trek.
Researchers define impulsivity, like accessibility, inconsistently – and often imprecisely. Impulsivity, which clearly exists on a spectrum, connotes overvaluing of immediate feelings and thoughts at the expense of longer term goals and aspirations. Some suicide research appears to define impulsivity as the antithesis of planned behavior;21,22 others define it pragmatically as behaviors executed within 5 minutes of a decision,23 and still others contend that “suicidal behavior is rarely if ever impulsive.”24 Furthermore, when we assess impulsivity, we must acknowledge a fundamental difference between “impulsive” shootings and poisonings that are accomplished at home and within seconds or minutes, from “impulsive” Golden Gate Bridge suicide attempts, which require substantial travel and time commitments, and inherently involve the potential for others to intervene.
Those arguing that the bridge suicide barrier will save lives often bring up two additional sets of numbers to back up their assertions. They provide evidence that most of those people who were stopped in their attempts at suicide at the Golden Gate do not go on to commit suicide elsewhere, and that many of those who survived their attempts express regret at having tried to kill themselves. Specifically, 94% of those who were prevented from jumping from the Golden Gate had not committed suicide after a median follow-up of 26 years, according to a follow-up study published a few years ago. On the other hand, those who have made a serious suicide attempt have a substantially increased risk, relative to the general population, of dying from a later attempt,25,26 and the strongest predictor for death by suicide is having made a previous, serious suicide attempt.27
While all of these studies provide important and interesting information regarding suicide, none directly address the question of whether individuals will substitute attempts by other methods if the Golden Gate Bridge were no longer available. Many discussions blur the distinction between how individuals behave after a thwarted Golden Gate suicide attempt and how other people might act if we secured the bridge from any potential future suicide attempts. I hope that the following analogy makes this distinction clearer without trivializing: Imagine that we know that everyone who was interrupted while eating their dinner in a particular restaurant never went back and ate out anywhere, ever again. We could not conclude from this that another individual, who learned that the intended restaurant was indefinitely closed, would never dine out again. Once effective suicide barriers exist on the Golden Gate, this will likely become widely known, thereby greatly reducing the likelihood that any individuals will consider the possibility of jumping from the bridge. But it seems very unlikely that this would vanquish all suicidal impulses from the northern California population.
Lessons from patients
Two former patients of mine ended their lives by suicide from the Golden Gate. P, a solitary and lonely man in his 50s, was referred to me by his neighbor, Q, one of my long-term patients. P had a history of repeated assessments for lifelong depression, with minimal follow-up. I made a treatment plan with P that we hoped would address both his depression and his reluctance to engage with mental health professionals. He did not return for his follow-up appointment and ignored all my attempts to contact him.
P continued to have intermittent contact with Q. A decade after I had evaluated him, P was finally hospitalized for depression. Since P had no local family or friends, he asked Q to pick him up from the hospital at the time of his discharge. P asked Q to drive him to the Golden Gate Bridge, ostensibly to relish his release by partaking of the panoramic view of San Francisco from the bridge. They parked in the lot at the north end of the bridge, where Q stayed with the car at the vista point. The last that anyone saw of P was when Q noticed him walking on the bridge; nobody saw him go over and his body was not recovered.
In contrast to my brief connection with P, I worked with S over the course of 8 years to deal with her very severe attention-deficit/hyperactivity disorder and associated depression, which destroyed jobs and friendships, and estranged her from her family. She moved to Hawaii in hopes of “starting over with less baggage,” but I received a few phone calls over the next few years detailing suicide attempts, including driving her car off a bridge. Floundering in life, she returned to San Francisco and was hospitalized with suicidal ideation. The inpatient team sedated her heavily, ignored her past treatments and diagnoses, and discharged her after several days. Within a day of discharge, S’s sister called to say that S’s body had been recovered from the water below the bridge.
I don’t think that suicide was inevitable for either P or S, but I also lack any indication that either would be alive today had we installed suicide barriers on the Golden Gate years ago. Unless we eliminate access to guns, cars, trains, poisons, ropes, tall buildings and cliffs, people contemplating suicide will have numerous options at their disposal. We are likely to save lives by continuing to find ways to restrict access to means of death that can be used within seconds and have a high degree of lethality, and we should persist with such efforts. Buying a $5 trigger lock for every gun in California, and spending tens of millions on a public service campaign would cost less and may well save more lives than the Golden Gate suicide barrier. Unfortunately, we still possess very limited knowledge regarding which suicide prevention measures have an “impact on actual deaths or behavior.”28
To increase our efficacy in reducing suicide, we need to find better treatments for depression and anxiety. We also need to identify better ways of targeting those most at risk for suicide,29 improve our delivery of such treatments, and mitigate the social factors that contribute to such misery and unhappiness.
As a psychiatrist who has lost not only patients but also family members to suicide, I appreciate the hole in the soul these deaths create. I understand the drive to find ways to prevent additional deaths and save future survivors from such grief. But we must design psychiatric interventions that do the maximum good. To be imprecise in the lessons we learn from those who have killed themselves doubles down on the disservice to those lives already lost.
Dr. Kruse is a psychiatrist who practices in San Francisco. Several key details about the patients were changed to protect confidentiality.
References
1. Frommer’s Comprehensive Travel Guide, California. New York: Prentice Hall Travel, 1993.
2. “Chen Si, the ‘Angel of Nanjing,’ has saved more than 330 people from suicide,” by Matt Young, News.com.au. May 14, 2017.
3. “Finding Kyle,” by Lizzie Johnson, San Francisco Chronicle. Feb 8, 2019.
4. Beautrais A. Suicide by jumping: A review of research and prevention strategies. Crisis. 2007 Jan;28 Suppl 1:58-63.
5. Gunnell D et al. The global distribution of fatal pesticide self-poisoning: Systematic review. BMC Public Health. 2007 Dec 21;7:357.
6. Vijayakumar L and Satheesh-Babu R. Does ‘no pesticide’ reduce suicides? Int J Soc Psychiatry. 2009 Jul 17;55:401-6.
7. Kreitman N. The coal gas story. United Kingdom suicide rates, 1960-71. Br J Prev Soc Med. 1976 Jun;30(2):86-93.
8. Ajdacic-Gross V et al. Changing times: A longitudinal analysis of international firearm suicide data. Am J Public Health. 2006 Oct;96(10):1752-5.
9. Reisch T et al. Change in suicide rates in Switzerland before and after firearm restriction resulting from the 2003 “Army XXI” reform. Am J Psychiatry. 2013 Sep;170(9):977-84.
10. Lubin G et al. Decrease in suicide rates after a change of policy reducing access to firearms in adolescents: A naturalistic epidemiological study. Suicide Life Threat Behav. 2010 Oct;40(5):421-4.
11. Sinyor M and Levitt A. Effect of a barrier at Bloor Street Viaduct on suicide rates in Toronto: Natural experiment. BMJ. 2010;341. doi: 10.1136/bmj.c2884.
12. O’Carroll P and Silverman M. Community suicide prevention: The effectiveness of bridge barriers. Suicide Life Threat Behav. 1994 Spring;24(1):89-91; discussion 91-9.
13. Pelletier A. Preventing suicide by jumping: The effect of a bridge safety fence. Inj Prev. 2007 Feb;13(1):57-9.
14. Bennewith O et al. Effect of barriers on the Clifton suspension bridge, England, on local patterns of suicide: Implications for prevention. Br J Psychiatry. 2007 Mar;190:266-7.
15. Harvard T.H. Chan School of Public Health. 2004. “How do people most commonly complete suicide?”
16. “How cliff diving works,” by Heather Kolich, HowStuffWorks.com. Oct 5, 2009.
17. “Bridge design and construction statistics.” Goldengate.org
18. “How did teen survive fall from Golden Gate Bridge?” by Remy Molina, Live Science. Apr 19, 2011.
19. Seiden R. Where are they now? A follow-up study of suicide attempters from the Golden Gate Bridge. Suicide Life Threat Behav. 1978 Winter;8(4):203-16.
20. Presidio demographics. Point2homes.com.
21. Baca-García E et al. A prospective study of the paradoxical relationship between impulsivity and lethality of suicide attempts. J Clin Psychiatry. 2001 Jul;62(7):560-4.
22. Lim M et al. Differences between impulsive and non-impulsive suicide attempts among individuals treated in emergency rooms of South Korea. Psychiatry Investig. 2016 Jul;13(4):389-96.
23. Simon O et al. Characteristics of impulsive suicide attempts and attempters. Suicide Life Threat Behav. 2001;32(1 Suppl):49-59.
24. Anestis M et al. Reconsidering the link between impulsivity and suicidal behavior. Pers Soc Psychol Rev. 2014 Nov;18(4):366-86.
25. Ostamo A et al. Excess mortality of suicide attempters. Soc Psychiatry Psychiatr Epidemiol. 2001 Jan;36(1):29-35.
26. Leon A et al. Statistical issues in the identification of risk factors for suicidal behavior: The application of survival analysis. Psychiatry Res. 1990 Jan;31(1):99-108.
27. Bostwick J et al. Suicide attempt as a risk factor for completed suicide: Even more lethal than we knew. Am J Psychiatry. 2016 Nov 1;173(11):1094-100.
28. Stone D and Crosby A. Suicide prevention. Am J Lifestyle Med. 2014;8(6):404-20.
29. Belsher B et al. Prediction models for suicide attempts and deaths: A systematic review and simulation. JAMA Psychiatry. 2019 Mar 13. doi: 10.1001/jamapsychiatry.2019.0174.
Monitoring, early intervention key to CAR T safety
GLASGOW – Constant patient monitoring and early intervention with tocilizumab and steroids are essential to the safe delivery of chimeric antigen receptor (CAR) T-cell therapy in patients with non-Hodgkin lymphoma (NHL), according to a leading expert.
As a clinical researcher at MD Anderson Cancer Center in Houston, Loretta Nastoupil, MD, has played an active role in the evolution of CAR T-cell therapy, from early trials to the ongoing development of treatment protocols. During a presentation at the annual meeting of the British Society for Haematology, Dr. Nastoupil discussed leading topics in CAR T-cell therapy, with an emphasis on safe delivery.
“[Toxicity] is something we don’t talk about as much as we should, partly because this therapy works and it’s really exciting,” Dr. Nastoupil said. “But the toxicity is not something that I minimize, and it’s very challenging. It’s led us to restructure our inpatient services. It’s led to a lot of sleepless nights. These patients can do very, very well, or they can do very, very poorly in terms of toxicity and I think the most important strategy is recognition and early intervention.”
Monitoring
Early recognition depends on close monitoring, Dr. Nastoupil said, which is carried out by highly trained nursing staff who follow therapy-specific decision algorithms.
“We have nurses that are on the front line,” Dr. Nastoupil said. “They’re the most important group. We have staff that round on [patients] daily, but the nurses are there 24 hours a day. We have a flow sheet where they grade cytokine release syndrome and neurotoxicity every 8 hours, or if there is an acute change in symptoms or toxicity, they’ll do it in real time.”
Dr. Nastoupil said that when these toxicities are detected, intervention now occurs sooner than it did for some of the first patients to receive CAR T-cell therapy.
“Initially there was a lot of fear surrounding anything that would abort the CAR T-cell therapy,” Dr. Nastoupil said. “There was concern that if you were trying to mitigate some of the toxicity you might have a negative impact on efficacy ... [W]ith the first iteration of studies, generally we were waiting until grade 3 or higher cytokine release syndrome before initiating either tocilizumab and/or steroids. As the studies evolved, it started to move into grade 2 toxicity that we started using therapy, mostly because we started to see that those patients were still responding.”
At MD Anderson, these earlier interventions have decreased the severity of adverse events.
“It’s rare nowadays to have grade 3 or 4 cytokine release syndrome because we are generally introducing abortive therapy at grade 2,” Dr. Nastoupil said, citing increased use of steroids and tocilizumab.
Currently, no consensus exists for managing these events, partly because clinicians are still learning about best management practices.
“There will be a consensus on management,” Dr. Nastoupil said. “I think that’s needed. The problem is, it will probably evolve as we get more experience with managing these patients. I think there’s been a little hesitation to put something out on paper knowing that a year from now that might change.”
Grading toxicity
In contrast, Dr. Nastoupil said that a consensus has been reached for grading acute toxicity. Of note, fever is now considered an essential element of cytokine release syndrome.
“The first thing we see [with cytokine release syndrome] is fever, generally speaking,” Dr. Nastoupil said. “That will prompt a workup for infection because these patients are going to be neutropenic. And we initiate broad spectrum antimicrobials.”
She said that some patients treated with CAR T-cell therapy have had disseminated fungal infections, so clinicians need to be on the lookout for septic shock.
To assess neurotoxicity, the team at MD Anderson uses an objective scoring system, called “CARTOX.” This helps maintain consistency when facing broadly different neurological presentations.
“There’s such a wide ranging spectrum of patients that are undergoing neurotoxicity you can’t expect someone, even myself, to be consistent when you are trying to tease out how serious it is,” Dr. Nastoupil said.
With CARTOX, nurses can easily score patients and call clinicians with the results. Still, this doesn’t eliminate the difficulties inherent in managing neurotoxicity, particularly when it is severe.
“I’d say one of the areas that is still very challenging is when [patients with neurotoxicity] are no longer responding,” Dr. Nastoupil said. “You have to be very mindful of seizure activity. We’ve had a couple of patients with status [epilepticus]. You don’t see seizure activity physically, but when you do an EEG, you pick it up.”
Dr. Nastoupil added that most centers are now giving patients prophylactic levetiracetam (Keppra) to lower seizure risk.
Choosing therapy
When selecting between the two therapies currently approved by the Food and Drug Administration – tisagenlecleucel (Kymriah) and axicabtagene ciloleucel (Yescarta) – based on safety, Dr. Nastoupil said that rates of cytokine release syndrome appear similar, but neurotoxicity rates may differ.
“Cytokine release syndrome in my opinion is probably more similar than different in terms of grade 3 or higher because tocilizumab and steroids work quite well in aborting those toxicities,” Dr. Nastoupil said. “But neurotoxicity still sticks out in my mind as the most striking difference, where with axicabtagene you see more grade 3 or higher neurotoxicity, though very, very few deaths as a result of this. But it’s very challenging in terms of management.”
According to Dr. Nastoupil, comparisons between CAR T-cell therapies have been complicated by differences in clinical trial methodologies. However, she offered a general conclusion regarding efficacy.
“[W]hat I’ll tell you, at the end of the day, is [that existing CAR T-cell therapies] all seem to sort of settle out around 30%-40% in terms of durable responses,” Dr. Nastoupil said.
Dr. Nastoupil concluded her presentation with an overview and look to the future.
“I do think [CAR T-cell therapy] is transformative, particularly for our chemo refractory patients,” she said. “There is nothing else like it. The problem right now is that it is only durable in 40% of patients. So can we be better at selecting out patients that are more likely to respond? Does introducing this in earlier lines of therapy increase that fraction of patients that are potentially cured?”
Considering these questions, she said: “We need more patients. We need more data. We need longer follow-up to understand the nuances of this therapy.”
Dr. Nastoupil previously reported financial relationships with Celgene, Genentech, Gilead, Merck, Novartis, Spectrum, and TG Therapeutics.
EXPERT ANALYSIS FROM BSH 2019