Dogs can be protective, even against Crohn’s disease


Sorry, cat people and only children: Having a dog as a toddler and growing up in a large family are two things linked to a significantly lower chance of getting Crohn’s disease later in life, according to a new study.

Children who lived with a dog between the ages of 2 and 4 years were 37% less likely to have Crohn’s disease, the study found. And those who lived with at least three other family members during the first year of life were 64% less likely to have this form of inflammatory bowel disease (IBD).

“In this study, we’re interested in environmental exposures and which ones are associated with Crohn’s disease onset,” Williams Turpin, PhD, said in a media interview May 23 at the annual Digestive Disease Week® (DDW).

Dr. Turpin and colleagues also looked at other environmental factors – including living on a farm, drinking unpasteurized milk or well water, and growing up with a cat – but none of these showed a significant link to risk.

Two other factors were associated with a slight increase in risk: having a sibling with Crohn’s disease and living with a bird at the time of the study. But the number of bird owners was small; only a few people in the study had a pet bird when they enrolled.

The link to living with a dog as a toddler “was more robust,” said Dr. Turpin, a project manager at Mount Sinai Hospital in Toronto.

The study included 4,289 healthy first-degree relatives of people diagnosed with Crohn’s disease. They provided urine, blood, and stool samples and did surveys about environmental exposures at different stages of life.

Investigators followed them for an average of 5.6 years, during which time 86 people developed Crohn’s disease.
 

Gut instinct

Living with a dog early in life likely means more exposure to a wider range of microbes, strengthening a person’s immune system against later challenges. This theory was supported by the study’s comparison of the gut microbiome in people who did and did not have a dog in the home early in life.

Dr. Turpin and colleagues genetically sequenced the gut microbiome of the people in the study and found differences in bacteria between groups.

“Our study also shows that just by living with a dog, it impacts your gut microbiome composition, which may have an impact on the immune response later in life,” Dr. Turpin said.

The researchers also looked at the health of the gut by measuring certain factors in the urine. One factor was higher in people who did not live with a dog at any point.
 

Mediated by the microbiome?

Living with a dog between the ages of 2 and 4 years and living in a large family (more than three people) during the first year of life were significantly associated with a lower risk of Crohn’s disease onset.

It is unknown if the results apply to other populations; the researchers studied first-degree relatives of people with Crohn’s disease.

“The study needs to be replicated and validated,” Dr. Turpin said.

Future research could evaluate people who never had a dog and look for changes in their microbiome after they get one.
 

‘Well-crafted’ study

“It’s a really interesting study from a good group. It’s novel in terms of getting at what really drives environmental risk factors,” said Brigid Boland, MD, a gastroenterologist at UC San Diego Health, who was not affiliated with the study.

Autoimmune diseases are complicated to study, Dr. Boland noted, in part because the risk of developing one is low, and researchers must look back in time to identify what put people at risk.

“The study was well crafted in choosing siblings and family members of people with IBD,” Dr. Boland said, agreeing with Dr. Turpin that more research is needed to understand this.

A version of this article first appeared on WebMD.com.

FROM DDW 2022

Pfizer asks FDA to authorize COVID vaccine for children younger than 5


The FDA has accepted Pfizer’s application for a COVID-19 vaccine for children under age 5, which clears the way for approval and distribution in June.

Pfizer announced June 1 that it completed the application for a three-dose vaccine for kids between 6 months and 5 years old, and the FDA said it received the emergency use application.

Children in this age group – the last to be eligible for COVID-19 vaccines – could begin getting shots as early as June 21, according to White House COVID-19 response coordinator Ashish Jha, MD.

Meanwhile, COVID-19 cases are still high – an average of 100,000 cases a day – but death numbers are about 90% lower than they were when President Joe Biden first took office, Dr. Jha said. 

The FDA’s advisory group, the Vaccines and Related Biological Products Advisory Committee, is scheduled to meet June 14 and June 15 to discuss data submitted by both Pfizer and Moderna.  

If the FDA gives them the green light, the CDC will then weigh in.

“We know that many, many parents are eager to vaccinate their youngest kids, and it’s important to do this right,” Dr. Jha said at a White House press briefing on June 2. “We expect that vaccinations will begin in earnest as early as June 21 and really roll on throughout that week.”

States can place their orders as early as June 3, Dr. Jha said, and there will initially be 10 million doses available. If the FDA gives emergency use authorization for the vaccines, the government will begin shipping doses to thousands of sites across the country.

“The good news is we have plenty of supply of Pfizer and Moderna vaccines,” Dr. Jha said. “We’ve asked states to distribute to their highest priority sites, serving the highest risk and hardest to reach areas.”

Pfizer’s clinical trials found that three doses of the vaccine for children ages 6 months to under 5 years were safe and 80% effective against the Omicron variant.

The FDA’s meeting announcement schedules a discussion of the Moderna vaccine for ages 6-17 on June 14 and of the Pfizer and Moderna vaccines for young children on June 15.

Moderna applied for FDA authorization of its two-dose vaccine for children under age 6 on April 28. The company said the vaccine was 51% effective against infections with symptoms for children ages 6 months to 2 years and 37% effective for ages 2-5.

Pfizer’s 3-microgram dose is one-tenth of its adult dose. Moderna’s 25-microgram dose is one-quarter of its adult dose.

A version of this article first appeared on Medscape.com.


High-dose antipsychotics show some benefit in treatment-resistant cases


Patients with severe schizophrenia who fail to respond to treatment with standard doses of second-generation antipsychotics show significant improvement with – and tolerance of – higher maintenance doses of the drugs, new research shows.

“The use of [higher doses of] long-acting injectable second-generation antipsychotics shows improvement not only in treatment adherence, but also in diminished relapses and suicide attempts compared with other previous treatment options used with these severely ill patients,” lead author Juan Jose Fernandez-Miranda, MD, said in an interview.


Dr. Fernandez-Miranda, of the Mental Health Service of the Principality of Asturias, in Gijón, Spain, underscored the tolerability of the novel approach of high doses: “No important side effects were found, and less than occurred with previous treatments,” he said.

While higher doses of second-generation antipsychotics for patients with treatment refractory schizophrenia are sometimes considered necessary, particularly with acute psychosis, evidence of benefits of the approach is lacking, and there are concerns about adverse events such as extrapyramidal symptoms and hyperprolactinemia.

To investigate the effects, the authors evaluated patients with severe, treatment-resistant schizophrenia (CGI-S = 5) enrolled in a community-based, case-managed program.

All had been treated over the previous 3 years with at least two different antipsychotics – including clozapine in a few cases – at standard doses, with poor outcomes. Eligibility also required being at risk of medication noncompliance and/or having experienced a lack of effectiveness or adverse effects with previous antipsychotics.

For the second 3 years of the observational study, they were treated with doses of at least 75 mg of risperidone long-acting injectable (n = 60), 175 mg or more of monthly paliperidone palmitate (n = 60), or 600 mg or higher of aripiprazole once monthly (n = 30).

During the study, the average antipsychotic doses were: risperidone, 111.2 mg/14 days; paliperidone palmitate, 231.2 mg eq./28 days; and aripiprazole, 780 mg/28 days. In addition to the intensive pharmacologic intervention, patients received an integrated psychosocial intervention, as in the previous 3 years.

Over the 3 years with the higher maintenance doses, significant improvements were observed with all of the injectable treatment groups in terms of decreases on the Clinical Global Impression Scale – Severity score (CGI-S; P < .01) and in the four areas of the World Health Organization Disability Schedule (WHO-DAS), including in self-care, occupational, family, and social measures (P < .01 through P < .001).

Scores on the Medication Adherence Rating Scale (MARS) increased with all of the long-acting injectables (P < .01), particularly with paliperidone palmitate and aripiprazole.

Patients had significant decreases in hospital admissions at the end of the 36-month treatments and reductions in suicide attempts (both P < .001), compared with the previous 3 years, without any differences across the three injectables.

Importantly, tolerability was good for all of the long-acting antipsychotics, with reductions in side effects as well as in biological parameters compared with previous treatments, notably in the aripiprazole group.

While reductions in weight and prolactin levels were observed in all long-acting treatments, the differences were statistically significant only among patients treated with aripiprazole (P < .05), as was expected.

Two patients treated with aripiprazole discontinued because of side effects, compared with five patients on paliperidone palmitate and nine on risperidone.

One person in the aripiprazole group discontinued because of lack of effectiveness, compared with two in the paliperidone palmitate group and four in the risperidone group.

Dr. Fernandez-Miranda noted that “both the intensive case-managed multicomponent treatment and use of high doses of long-acting antipsychotics were in all probability linked to the high adherence and positive clinical outcomes.”

The results provide evidence that “long-acting second-generation antipsychotics are a remarkable option for patients with severe schizophrenia and a background of treatment discontinuation or intolerable adverse effects with other antipsychotics,” Dr. Fernandez-Miranda added.

“We suggest that, in some illness critical conditions, high doses of long-acting second-generation antipsychotics could represent an alternative to clozapine,” he added.

Some hesitation warranted

Commenting on the study, T. Scott Stroup, MD, MPH, professor of psychiatry at Columbia University, New York, noted key limitations: the lack of randomization and the absence of a comparison group receiving clozapine or typical-dose long-acting injectables.


“In addition, pre-post or mirror-image designs may be affected by expectation bias and regression to the mean,” he said in an interview.

“I don’t doubt that some patients do well on relatively high doses of long-acting injectable medications and that some tolerate these doses,” he noted. “Most adverse effects are dose related, but without a typical-dose comparison group we cannot assess this.”

Ultimately, Dr. Stroup recommends sticking with standard recommendations – at least to start.

“My take-home message is that clozapine remains the treatment of choice for treatment-resistant schizophrenia, and in most cases clozapine should be tried before considering high-dose long-acting injectables,” he said. 

“If there is uncertainty about whether someone is taking a prescribed oral antipsychotic medication, then a trial of a typical dose of a long-acting injectable is a good option to rule out pseudo-treatment resistance.”

Furthermore, “this study doesn’t affect the recommendation that people who need antipsychotic medications should receive the lowest effective dose,” he said.

The authors and Dr. Stroup had no disclosures to report.

Meeting/Event
Publications
Topics
Sections
Meeting/Event
Meeting/Event

– Patients with severe schizophrenia who fail to respond to treatment with standard doses of second-generation antipsychotics show significant improvement with – and tolerance of – higher maintenance doses of the drugs, new research shows.

“The use of [higher doses of] long-acting injectable second-generation antipsychotics shows improvement not only in treatment adherence, but also in diminished relapses and suicide attempts compared with other previous treatment options used with these severely ill patients,” lead author Juan Jose Fernandez-Miranda, MD, said in an interview.

Dr. Juan Jose Fernandez-Miranda

Dr. Fernandez-Miranda, of the Mental Health Service of the Principality of Asturias, in Gijón, Spain, underscored the tolerability of the novel approach of high doses: “No important side effects were found, and less than occurred with previous treatments,” he said.

While higher doses of second-generation antipsychotics for patients with treatment refractory schizophrenia are sometimes considered necessary, particularly with acute psychosis, evidence of benefits of the approach is lacking, and there are concerns about adverse events such as extrapyramidal symptoms and hyperprolactinemia.

To investigate the effects, the authors evaluated patients in a community-based, case managed program with severe, (CGI-S = 5), resistant schizophrenia.

All had been treated in the previous 3 years with at least two different antipsychotics, including clozapine in a few cases, with poor outcomes when receiving standard doses, and eligibility included being at risk of medication noncompliance, and/or experiencing a lack of effectiveness or adverse effects with previous antipsychotics.

For the second 3 years of the observational study, they were treated with doses of at least 75 mg of risperidone long-acting injectable (n = 60), 175 mg or more of monthly paliperidone palmitate (n = 60), or 600 mg or higher of aripiprazole once monthly (n = 30).

During the study, the average antipsychotic doses were: risperidone 111.2 mg/14 days; paliperidone palmitate 231.2 mg. eq./28 days; and aripiprazole 780 mg/28 days. In addition to the intensive pharmacological intervention, patients received psychosocial integrated intervention, as in the previous 3 years.

Over the 3 years with the higher maintenance doses, significant improvements were observed with all of the injectable treatment groups in terms of decreases on the Clinical Global Impression Scale – Severity score (CGI-S; P < .01) and in the four areas of the World Health Organization Disability Schedule (WHO-DAS), including in self-care, occupational, family, and social measures (P < .01 through P < .001).

Scores on the Medication Adherence Rating Scale (MARS), increased with all of the long-acting injectables (P < .01), particularly with paliperidone palmitate and aripiprazole.

Patients had significant decreases in hospital admissions at the end of the 36-month treatments and reductions in suicide attempts (both P < .001), compared with the previous 3 years, without any differences across the three injectables.

Importantly, tolerability was good for all of the long-acting antidepressants, with reductions in side effects as well as biological parameters compared with previous treatments, notably in the aripiprazole group.

While reductions in weight and prolactin levels were observed in all long-acting treatments, the differences were statistically significant only among patients treated with aripiprazole (P < .05), as was expected.

Two patients treated with aripiprazole discontinued treatment because of side effects from treatment, and the rate was five with paliperidone palmitate and nine with risperidone.


Patients with severe schizophrenia who fail to respond to treatment with standard doses of second-generation antipsychotics show significant improvement with – and tolerance of – higher maintenance doses of the drugs, new research shows.

“The use of [higher doses of] long-acting injectable second-generation antipsychotics shows improvement not only in treatment adherence, but also in diminished relapses and suicide attempts compared with other previous treatment options used with these severely ill patients,” lead author Juan Jose Fernandez-Miranda, MD, said in an interview.


Dr. Fernandez-Miranda, of the Mental Health Service of the Principality of Asturias, in Gijón, Spain, underscored the tolerability of the novel approach of high doses: “No important side effects were found, and less than occurred with previous treatments,” he said.

While higher doses of second-generation antipsychotics for patients with treatment refractory schizophrenia are sometimes considered necessary, particularly with acute psychosis, evidence of benefits of the approach is lacking, and there are concerns about adverse events such as extrapyramidal symptoms and hyperprolactinemia.

To investigate the effects, the authors evaluated patients with severe, treatment-resistant schizophrenia (Clinical Global Impression – Severity [CGI-S] score of 5) enrolled in a community-based, case-managed program.

All had been treated in the previous 3 years with at least two different antipsychotics (including, in a few cases, clozapine), with poor outcomes at standard doses. Eligibility criteria included being at risk of medication noncompliance and/or having experienced a lack of effectiveness or adverse effects with previous antipsychotics.

For the second 3 years of the observational study, patients were treated with at least 75 mg of long-acting injectable risperidone every 14 days (n = 60), 175 mg or more of monthly paliperidone palmitate (n = 60), or 600 mg or more of monthly aripiprazole (n = 30).

During the study, the average antipsychotic doses were risperidone 111.2 mg/14 days, paliperidone palmitate 231.2 mg eq/28 days, and aripiprazole 780 mg/28 days. In addition to the intensive pharmacologic intervention, patients received an integrated psychosocial intervention, as in the previous 3 years.
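
Because the average doses are reported over different dosing intervals (14 days for risperidone, 28 days for the other two agents), a quick normalization helps compare them side by side. A minimal Python sketch, using only the figures reported above:

```python
# Illustrative normalization of the three reported average doses to a common
# 28-day interval (risperidone was reported per 14 days, the other two per
# 28 days). Figures are the study averages as reported in the text.
doses_reported = {
    "risperidone LAI": (111.2, 14),          # mg per 14 days
    "paliperidone palmitate": (231.2, 28),   # mg eq. per 28 days
    "aripiprazole LAI": (780.0, 28),         # mg per 28 days
}

dose_per_28_days = {
    drug: mg * 28 / interval_days
    for drug, (mg, interval_days) in doses_reported.items()
}
```

Normalized this way, the risperidone average corresponds to roughly 222 mg per 28 days.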

Over the 3 years with the higher maintenance doses, significant improvements were observed with all of the injectable treatment groups in terms of decreases on the Clinical Global Impression Scale – Severity score (CGI-S; P < .01) and in the four areas of the World Health Organization Disability Schedule (WHO-DAS), including in self-care, occupational, family, and social measures (P < .01 through P < .001).

Scores on the Medication Adherence Rating Scale (MARS) increased with all of the long-acting injectables (P < .01), particularly with paliperidone palmitate and aripiprazole.

Patients had significant decreases in hospital admissions at the end of the 36-month treatments and reductions in suicide attempts (both P < .001), compared with the previous 3 years, without any differences across the three injectables.

Importantly, tolerability was good for all of the long-acting antipsychotics, with reductions in side effects as well as in biological parameters compared with previous treatments, notably in the aripiprazole group.

While reductions in weight and prolactin levels were observed in all long-acting treatments, the differences were statistically significant only among patients treated with aripiprazole (P < .05), as was expected.

Two patients treated with aripiprazole discontinued because of side effects, compared with five treated with paliperidone palmitate and nine with risperidone.

One person in the aripiprazole group discontinued because of a lack of effectiveness, while two discontinued in the paliperidone palmitate group and four with risperidone.

Dr. Fernandez-Miranda noted that “both the intensive case-managed multicomponent treatment and use of high doses of long-acting antipsychotics were in all probability linked to the high adherence and positive clinical outcomes.”

The results provide evidence that “long-acting second-generation antipsychotics are a remarkable option for patients with severe schizophrenia and a background of treatment discontinuation or intolerable adverse effects with other antipsychotics,” Dr. Fernandez-Miranda added.

“We suggest that, in some illness critical conditions, high doses of long-acting second-generation antipsychotics could represent an alternative to clozapine,” he added.


Some hesitation warranted

Commenting on the study, T. Scott Stroup, MD, MPH, professor of psychiatry at Columbia University, New York, noted the key limitations of a lack of randomization and comparison group of clozapine or typical-dose long-acting injectables.


“In addition, pre-post or mirror-image designs may be affected by expectation bias and regression to the mean,” he said in an interview.

“I don’t doubt that some patients do well on relatively high doses of long-acting injectable medications and that some tolerate these doses,” he noted. “Most adverse effects are dose related, but without a typical-dose comparison group we cannot assess this.”

Ultimately, Dr. Stroup recommends sticking with standard recommendations – at least to start.

“My take-home message is that clozapine remains the treatment of choice for treatment-resistant schizophrenia, and in most cases clozapine should be tried before considering high-dose long-acting injectables,” he said. 

“If there is uncertainty about whether someone is taking a prescribed oral antipsychotic medication, then a trial of a typical dose of a long-acting injectable is a good option to rule out pseudo-treatment resistance.”

Furthermore, “this study doesn’t affect the recommendation that people who need antipsychotic medications should receive the lowest effective dose,” he said.

The authors and Dr. Stroup had no disclosures to report.


AT APA 2022

Disallow All Ads
Content Gating
No Gating (article Unlocked/Free)
Alternative CME
Disqus Comments
Default
Use ProPublica
Hide sidebar & use full width
render the right sidebar.
Conference Recap Checkbox
Not Conference Recap
Clinical Edge
Display the Slideshow in this Article
Medscape Article
Display survey writer
Reuters content
Disable Inline Native ads
WebMD Article

Immunotherapy treatment combo charts new course for resectable NSCLC treatment

Article Type
Changed
Thu, 12/15/2022 - 14:31

Among patients with resectable non–small cell lung cancer (NSCLC), neoadjuvant nivolumab combined with chemotherapy led to significantly longer event-free survival and more frequent pathological complete response in patients than chemotherapy alone.

“Our data show that three cycles of neoadjuvant nivolumab plus chemotherapy improved long-term clinical outcomes in patients with resectable stage IB-IIIA NSCLC without impeding the feasibility of surgery or increasing the incidence of adverse events as compared with chemotherapy alone,” wrote the investigators, who were led by Patrick M. Forde, MB, BCh, Johns Hopkins Kimmel Cancer Center, Baltimore. The study was published online in the New England Journal of Medicine.

Nivolumab (Opdivo, Bristol-Myers Squibb), in combination with platinum-doublet chemotherapy, was approved in March by the Food and Drug Administration as a treatment for adults with early-stage, resectable NSCLC. It is the first approval of a neoadjuvant therapy for this patient population. The results of the study, called CheckMate 816, formed the basis of the approval.

About one in four NSCLC patients have resectable disease at diagnosis, but their mortality rate is 30%-55% even after surgery. Neoadjuvant chemotherapy improves survival in this group, but 5-year recurrence rates improve by just 5%-6%, and rates of pathological complete response are low.

In the neoadjuvant setting, the anti–programmed death 1 (PD-1) antibody nivolumab could reduce micrometastases and boost immune response against bulk tumor and tumor antigens. A phase 2 study published in the Journal of Thoracic Oncology showed that neoadjuvant nivolumab combined with chemotherapy conferred good 3-year overall survival (81.9%) and progression-free survival (69.6%) among patients with stage IIIA NSCLC.
 

Results from CheckMate 816

CheckMate 816 is an open-label, phase 3 trial in which 358 patients were randomized to a neoadjuvant course of 360 mg nivolumab and platinum-doublet chemotherapy or platinum-doublet chemotherapy alone. Treatments occurred every 3 weeks for three cycles.

Definitive surgery was performed in 83.2% of the combination group (R0 resection, 83.2%) and 75.4% of the chemotherapy-only group (R0 resection, 77.8%). Neoadjuvant treatment was completed by 93.8% of the combination group and 84.7% of the chemotherapy-only group, and 11.9% and 22.2%, respectively, went on to adjuvant therapy. A total of 21.2% of the combination group received subsequent cancer therapy versus 43.6% of the chemotherapy-only group.

After a minimum follow-up of 21 months, the combination group had a median event-free survival of 31.6 months versus 20.8 months in the chemotherapy-only group (hazard ratio for disease progression, disease recurrence, or death, 0.63; P = .005). The interim analysis for overall survival showed a possible trend towards improved overall survival in the combination group (HR, 0.57; 99.67% confidence interval, 0.30-1.07; P = .0008).

A total of 24.0% of the combination therapy achieved a pathological complete response versus 2.2% in the chemotherapy-only group (odds ratio, 13.94; P < .001).
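
As a sanity check, the reported odds ratio can be approximated from the two response percentages alone. A short Python sketch (the published 13.94 was derived from the underlying patient counts, so this approximation lands close to, but not exactly on, that figure):

```python
# Back-of-the-envelope check of the reported odds ratio for pathological
# complete response, using only the published percentages.
def odds(p):
    """Convert a proportion to its odds, p / (1 - p)."""
    return p / (1 - p)

pcr_combination = 0.240   # nivolumab + chemotherapy arm
pcr_chemo_only = 0.022    # chemotherapy-only arm

odds_ratio = odds(pcr_combination) / odds(pcr_chemo_only)
```

This lands at roughly 14, in line with the reported 13.94.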

Grade 3 or 4 treatment-related adverse events occurred in 33.5% of the combination group and 36.9% of the chemotherapy-only group.

The researchers noted that 63.1% of patients in the study had stage IIIA tumors, a stage that carries a poor prognosis.

There were benefits to the combination treatment across PD-L1–status subgroups, but event-free survival was longer among patients with a PD-L1 expression level of 1% or more.

The study is limited by its open-label nature. It was funded by Bristol-Myers Squibb.




FROM THE NEW ENGLAND JOURNAL OF MEDICINE


'New benchmark' set in phase-3 blood cancer study

Article Type
Changed
Tue, 01/17/2023 - 11:24

The largest trial to date in mantle cell lymphoma shows that adding the Bruton’s tyrosine kinase (BTK) inhibitor ibrutinib (Imbruvica) to standard of care treatment improves progression-free survival (PFS) by 50%.

The phase 3 SHINE study was conducted in 520 older patients (aged ≥ 65 years) with newly diagnosed mantle cell lymphoma who were randomized to receive ibrutinib or placebo plus bendamustine-rituximab (BR) and rituximab maintenance.

After 7 years of follow-up, median PFS was 80.6 months with the ibrutinib combination versus 52.9 months with placebo, offering patients an additional 2.3 years of disease-free life.
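
The headline arithmetic follows directly from the two medians, both reported in months; a quick check in Python:

```python
# Absolute and relative progression-free survival gain, computed from the
# two reported medians (in months).
pfs_ibrutinib_mo = 80.6   # ibrutinib + bendamustine-rituximab arm
pfs_placebo_mo = 52.9     # placebo + bendamustine-rituximab arm

gain_months = pfs_ibrutinib_mo - pfs_placebo_mo    # absolute gain, ~27.7 months
gain_years = gain_months / 12                      # ~2.3 years
relative_gain = gain_months / pfs_placebo_mo       # ~52%, i.e. the "50%" headline
```

The difference of about 27.7 months is the roughly 2.3 years, and roughly 50% relative improvement, quoted above.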

Complete response rates were higher with ibrutinib versus placebo, and importantly, there were no new safety signals with the combination.

“We believe this phase 3 clinical trial sets a new benchmark for patients with newly diagnosed mantle cell lymphoma and the elderly,” commented lead investigator Dr. Michael Wang, department of lymphoma & myeloma, University of Texas MD Anderson Cancer Center, Houston.

He was speaking during a press briefing at the annual meeting of the American Society of Clinical Oncology, where the study was presented. It was also simultaneously published in the New England Journal of Medicine.

These results “bring new hope to newly diagnosed, older patients with this rare cancer, who have had too few treatment options” and are “generally underrepresented in clinical trials,” commented Dr. Julie R. Gralow, ASCO chief medical officer.

She described the difference in PFS between the two treatment groups as “profound” and “clinically meaningful,” and said the combination can be considered a “new standard of care as initial treatment of older patients with mantle cell lymphoma.”
 

Some lymphoma experts not impressed

The study got pushback from several lymphoma experts commenting on Twitter.

Lymphoma specialist and consultant hematologist Toby Eyre, MBChB, from Oxford University, London, highlighted the fact that although there was a PFS benefit, there was no overall survival benefit and more toxicity. 

“I hope no one implements this regimen,” replied “Papa Heme” Dr. Aaron Goodman, a hematologist at UC San Diego Health, California.

“The authors should be congratulated on completing a large RCT in this space. As far as the result adding ibrutinib added about 28 mos to PFS. This is actually the median DoR of BTK inhibitors in the 2nd line. So big question is, whether the extra tox is worth it,” commented another lymphoma specialist, Tim Fenske, MD, of the Medical College of Wisconsin, Milwaukee, replying in the same Twitter thread.

“I don’t see a benefit in adding continuous ibrutinib upfront to BR, based on these results. Added toxicity + less treatment free interval make this a tough pill to swallow (pun intended),” commented Alan Skarbnik, MD, of Novant Health, Charlotte, N.C.
 

Potential for first-line use

Ibrutinib is already approved for use in mantle cell lymphoma, but in patients who have received at least one prior therapy; this is an accelerated approval, based on overall response rate. 

These new data could lead to approval for first-line use of the drug.

“There is an urgent need to improve outcomes for older patients with mantle cell lymphoma,” Dr. Wang commented in a company press release. “Given the median progression-free survival of 6.7 years, the ibrutinib combination demonstrated the potential to be a first-line treatment in this population.” 

Mantle cell lymphoma, a form of non-Hodgkin’s lymphoma, affects men more than women and is more common in people aged over 65 years. Older patients often cannot tolerate intensive chemotherapy or stem cell transplants, so they often have poor outcomes, Dr. Wang explained during the press briefing.

He noted that SHINE is the first phase 3 study to examine ibrutinib plus BR as a first-line therapy in mantle cell lymphoma and involved patients with previously untreated stage II-IV disease aged ≥ 65 years not planning to undergo stem cell transplant.

Participants had a median age of 71 years, and 68%-71% were male across the treatment arms. Most were White (76%-79%), and the median time from initial diagnosis to randomization was 1.4-1.5 months.

At the data cut-off of June 30, 2021, median follow-up was 84.7 months. Disease progression or death had occurred in 44.4% of patients given ibrutinib and 58.0% of those given placebo.

Dr. Wang noted that the PFS curves “separated early, indicating the benefit that was achieved early within the first year and also that those benefits remained durable” throughout follow-up.

The percentage of patients with a complete response was 65.5% among patients treated with ibrutinib and 57.6% among those in the placebo group.

At the current analysis, there was no significant difference in overall survival between the two treatment arms, with a hazard ratio of 1.07 (P = .06).

Dr. Wang explained that “even though the study has been going on for 10 years, we don’t have enough deaths ... to evaluate overall survival yet.”

Furthermore, the median age of patients at enrollment was 71 years and is currently 78 years, with “half of them over 80 years,” so they are more likely to die of “other causes” than from mantle cell lymphoma, he commented.

He added that if the study had been designed to assess overall survival, it would have been “very different,” requiring 1,500 patients and a follow-up of 15-20 years.

The safety profile of the novel combination was “no surprise,” Dr. Wang said, and “consistent with what we’re seeing in daily practice.”

Grade 3/4 treatment-related adverse events were seen in 81.5% of patients treated with ibrutinib and 77.3% of those given placebo, and 47.1% and 48.1%, respectively, experienced grade 3/4 neutropenia.

In the post-presentation discussion, Dr. Wang said that approximately 40% of the patients in the placebo group received a BTK inhibitor at progression, and most were given ibrutinib.

He cautioned that the current results cannot be generalized to “other subtypes of lymphoma,” as they are “very different,” with different prognostic factors and different underlying biologies.

The study was funded by Janssen Pharmaceuticals and Pharmacyclics, an AbbVie Company. Dr. Wang has reported relationships with multiple companies, as listed in the article. Dr. Gralow has reported relationships with Genentech, AstraZeneca, Hexal, Puma Biotechnology, Roche, Novartis, Seagen, and Genomic Health.


A version of this article first appeared on Medscape.com.


The largest trial to date in mantle cell lymphoma shows that adding the Bruton’s tyrosine kinase (BTK) inhibitor ibrutinib (Imbruvica) to standard of care treatment improves progression-free survival (PFS) by 50%.

The phase 3 SHINE study was conducted in 520 older patients (aged ≥ 65 years) with newly diagnosed mantle cell lymphoma who were randomized to receive ibrutinib or placebo plus bendamustine-rituximab (BR) and rituximab maintenance.

After 7 years of follow-up, median PFS was 80.6 months with the ibrutinib combination versus 52.9 years with placebo, offering patients an additional 2.3 years of disease-free life.

Complete response rates were higher with ibrutinib versus placebo, and importantly, there were no new safety signals with the combination.

“We believe this phase 3 clinical trial sets a new benchmark for patients with newly diagnosed mantle cell lymphoma and the elderly,” commented lead investigator Dr. Michael Wang, department of lymphoma & myeloma, University of Texas MD Anderson Cancer Center, Houston.

He was speaking during a press briefing at the annual meeting of the American Society of Clinical Oncology, where the study was presented. It was also simultaneously published in the New England Journal of Medicine.

These results “bring new hope to newly diagnosed, older patients with this rare cancer, who have had too few treatment options” and are “generally underrepresented in clinical trials,” commented Dr. Julie R. Gralow, ASCO chief medical officer.

She described the difference in PFS between the two treatment groups as “profound” and “clinically meaningful,” and said the combination can be considered a “new standard of care as initial treatment of older patients with mantle cell lymphoma.”
 

Some lymphoma experts not impressed

The study got pushback from several lymphoma experts commenting on Twitter.

Lymphoma specialist and consultant hematologist Toby Eyre, MBChB, from Oxford University, London, highlighted the fact that although there was a PFS benefit, there was no overall survival benefit and more toxicity. 

“I hope no one implements this regimen,” replied “Papa Heme” Dr. Aaron Goodman, a hematologist at UC San Diego Health, California.

“The authors should be congratulated on completing a large RCT in this space. As far as the result adding ibrutinib added about 28 mos to PFS. This is actually the median DoR of BTK inhibitors in the 2nd line. So big question is, whether the extra tox is worth it,” commented another lymphoma specialist, Dr. Tim Fenske, MD, of the Medical College of Wisconsin, Milwaukee, replying in the same Twitter thread. 

“I don’t see a benefit in adding continuous ibrutinib upfront to BR, based on these results. Added toxicity + less treatment free interval make this a tough pill to swallow (pun intended),” commented Dr. Alan Skarbnik, MD, of Novant Health, Charlotte, N.C.
 

Potential for first-line use

Ibrutinib is already approved for use in mantle cell lymphoma, but in patients who have received at least one prior therapy; this is an accelerated approval, based on overall response rate. 

These new data could lead to approval for first-line use of the drug.

“There is an urgent need to improve outcomes for older patients with mantle cell lymphoma,” Dr. Wang commented in a company press release. “Given the median progression-free survival of 6.7 years, the ibrutinib combination demonstrated the potential to be a first-line treatment in this population.” 

Mantle cell lymphoma, a form of non-Hodgkin’s lymphoma, affects men more than women and is more common in people aged over 65 years. Older patients often cannot tolerate intensive chemotherapy or stem cell transplants, so they often have poor outcomes, Dr. Wang explained during the press briefing.

He noted that SHINE is the first phase 3 study to examine ibrutinib plus BR as a first-line therapy in mantle cell lymphoma and involved patients with previously untreated stage II-IV disease aged ≥ 65 years not planning to undergo stem cell transplant.

Participants were a median age of 71 years, and 68%-71% were male. Most were White (76%-79%), and median time from initial diagnosis to randomization was 1.4-1.5 months.

At the data cut-off of June 30, 2021, median follow-up was 84.7 months. Disease progression or death had occurred in 44.4% of patients given ibrutinib and 58.0% of those given placebo.

Dr. Wang noted that the PFS curves “separated early, indicating the benefit that was achieved early within the first year and also that those benefits remained durable” throughout follow-up.

The percentage of patients with a complete response was 65.5% among patients treated with ibrutinib and 57.6% among those in the placebo group.

At the current analysis, there was no significant difference in overall survival between the two treatment arms, with a hazard ratio of 1.07 (P = .06).

Dr. Wang explained that “even though the study has been going on for 10 years, we don’t have enough deaths ... to evaluate overall survival yet.”

Furthermore, the median age of patients at enrollment was 71 years and is currently 78 years, with “half of them over 80 years,” so they are more likely to die of “other causes” than from mantle cell lymphoma, he commented.

He added that if the study had been designed to assess overall survival, it would have been “very different,” requiring 1,500 patients and a follow-up of 15-20 years.

The safety profile of the novel combination was “no surprise,” Dr. Wang said, and “consistent with what we’re seeing in daily practice.”

Grade 3/4 treatment-related adverse events were seen in 81.5% of patients treated with ibrutinib and 77.3% of those given placebo, and 47.1% and 48.1%, respectively, experienced grade 3/4 neutropenia.

In the post-presentation discussion, Dr. Wang said that approximately 40% of the patients in the placebo group received a BTK inhibitor at progression, and most were given ibrutinib.

He cautioned that the current results cannot be generalized to “other subtypes of lymphoma,” as they are “very different,” with different prognostic factors and different underlying biologies.

The study was funded by Janssen Pharmaceuticals and Pharmacyclics, an AbbVie Company. Dr. Wang has reported relationships with multiple companies, as listed in the article. Dr. Gralow has reported relationships with Genentech, AstraZeneca, Hexal, Puma Biotechnology, Roche, Novartis, Seagen, and Genomic Health.

 

 

A version of this article first appeared on Medscape.com.

The largest trial to date in mantle cell lymphoma shows that adding the Bruton’s tyrosine kinase (BTK) inhibitor ibrutinib (Imbruvica) to standard of care treatment improves progression-free survival (PFS) by 50%.

The phase 3 SHINE study was conducted in 520 older patients (aged ≥ 65 years) with newly diagnosed mantle cell lymphoma who were randomized to receive ibrutinib or placebo plus bendamustine-rituximab (BR) and rituximab maintenance.

After 7 years of follow-up, median PFS was 80.6 months with the ibrutinib combination versus 52.9 years with placebo, offering patients an additional 2.3 years of disease-free life.

Complete response rates were higher with ibrutinib versus placebo, and importantly, there were no new safety signals with the combination.

“We believe this phase 3 clinical trial sets a new benchmark for elderly patients with newly diagnosed mantle cell lymphoma,” commented lead investigator Dr. Michael Wang, department of lymphoma & myeloma, University of Texas MD Anderson Cancer Center, Houston.

He was speaking during a press briefing at the annual meeting of the American Society of Clinical Oncology, where the study was presented. It was also simultaneously published in the New England Journal of Medicine.

These results “bring new hope to newly diagnosed, older patients with this rare cancer, who have had too few treatment options” and are “generally underrepresented in clinical trials,” commented Dr. Julie R. Gralow, ASCO chief medical officer.

She described the difference in PFS between the two treatment groups as “profound” and “clinically meaningful,” and said the combination can be considered a “new standard of care as initial treatment of older patients with mantle cell lymphoma.”
 

Some lymphoma experts not impressed

The study got pushback from several lymphoma experts commenting on Twitter.

Lymphoma specialist and consultant hematologist Toby Eyre, MBChB, of the University of Oxford, England, highlighted the fact that although there was a PFS benefit, there was no overall survival benefit and more toxicity.

“I hope no one implements this regimen,” replied “Papa Heme” Dr. Aaron Goodman, a hematologist at UC San Diego Health, California.

“The authors should be congratulated on completing a large RCT in this space. As far as the result adding ibrutinib added about 28 mos to PFS. This is actually the median DoR of BTK inhibitors in the 2nd line. So big question is, whether the extra tox is worth it,” commented another lymphoma specialist, Tim Fenske, MD, of the Medical College of Wisconsin, Milwaukee, replying in the same Twitter thread.

“I don’t see a benefit in adding continuous ibrutinib upfront to BR, based on these results. Added toxicity + less treatment free interval make this a tough pill to swallow (pun intended),” commented Alan Skarbnik, MD, of Novant Health, Charlotte, N.C.
 

Potential for first-line use

Ibrutinib is already approved for use in mantle cell lymphoma, but in patients who have received at least one prior therapy; this is an accelerated approval, based on overall response rate. 

These new data could lead to approval for first-line use of the drug.

“There is an urgent need to improve outcomes for older patients with mantle cell lymphoma,” Dr. Wang commented in a company press release. “Given the median progression-free survival of 6.7 years, the ibrutinib combination demonstrated the potential to be a first-line treatment in this population.” 

Mantle cell lymphoma, a form of non-Hodgkin’s lymphoma, affects men more than women and is more common in people aged over 65 years. Older patients often cannot tolerate intensive chemotherapy or stem cell transplants, so they often have poor outcomes, Dr. Wang explained during the press briefing.

He noted that SHINE is the first phase 3 study to examine ibrutinib plus BR as a first-line therapy in mantle cell lymphoma and involved patients with previously untreated stage II-IV disease aged ≥ 65 years not planning to undergo stem cell transplant.

Participants had a median age of 71 years, and 68%-71% were male. Most were White (76%-79%), and median time from initial diagnosis to randomization was 1.4-1.5 months.

At the data cut-off of June 30, 2021, median follow-up was 84.7 months. Disease progression or death had occurred in 44.4% of patients given ibrutinib and 58.0% of those given placebo.

Dr. Wang noted that the PFS curves “separated early, indicating the benefit that was achieved early within the first year and also that those benefits remained durable” throughout follow-up.

The percentage of patients with a complete response was 65.5% among patients treated with ibrutinib and 57.6% among those in the placebo group.

At the current analysis, there was no significant difference in overall survival between the two treatment arms, with a hazard ratio of 1.07 (P = .06).

Dr. Wang explained that “even though the study has been going on for 10 years, we don’t have enough deaths ... to evaluate overall survival yet.”

Furthermore, the median age of patients at enrollment was 71 years and is currently 78 years, with “half of them over 80 years,” so they are more likely to die of “other causes” than from mantle cell lymphoma, he commented.

He added that if the study had been designed to assess overall survival, it would have been “very different,” requiring 1,500 patients and a follow-up of 15-20 years.

The safety profile of the novel combination was “no surprise,” Dr. Wang said, and “consistent with what we’re seeing in daily practice.”

Grade 3/4 treatment-related adverse events were seen in 81.5% of patients treated with ibrutinib and 77.3% of those given placebo, and 47.1% and 48.1%, respectively, experienced grade 3/4 neutropenia.

In the post-presentation discussion, Dr. Wang said that approximately 40% of the patients in the placebo group received a BTK inhibitor at progression, and most were given ibrutinib.

He cautioned that the current results cannot be generalized to “other subtypes of lymphoma,” as they are “very different,” with different prognostic factors and different underlying biologies.

The study was funded by Janssen Pharmaceuticals and Pharmacyclics, an AbbVie Company. Dr. Wang has reported relationships with multiple companies, as listed in the article. Dr. Gralow has reported relationships with Genentech, AstraZeneca, Hexal, Puma Biotechnology, Roche, Novartis, Seagen, and Genomic Health.

 

 

A version of this article first appeared on Medscape.com.

FROM ASCO 2022

The Slow, Long Search for Migraine’s Headwaters


Dr. Messoud Ashina is a Professor of Neurology, Faculty of Health and Medical Sciences, University of Copenhagen, Denmark. He is Director of the Human Migraine Research Unit at the Danish Headache Center and Department of Neurology, Rigshospitalet Glostrup. He serves as an associate editor for Cephalalgia, Journal of Headache and Pain, and Brain.

 

Dr. Faisal Mohammad Amin is an Associate Professor, Danish Headache Center, Department of Neurology, Rigshospitalet Glostrup, University of Copenhagen, Denmark. He is an associate editor for Headache Medicine and is President of the Danish Headache Society.

 

Dr. Ashina reports that he has received fees and grants from and/or has served as a principal trial investigator for AbbVie, Amgen, Eli Lilly, Lundbeck Pharmaceuticals, Lundbeck Foundation, Novartis, Novo Nordisk Foundation, and Teva.
 
Dr. Amin reports that he has worked as a consultant, speaker, and/or primary investigator for Eli Lilly, Lundbeck, Novartis, and Teva. Both authors have reported that they have no ownership interests and do not own stocks in any pharmaceutical company.

 

 

 

 

 

Since the time of the Neanderthals, humankind has looked for ways to rid the brain of migraine headache. There is evidence that trepanation (removing a portion of bone from the skull) was performed on Neolithic skulls. Did it work for those poor individuals? We will never know.

 

What is known is that the often circuitous hunt for effective treatments has taken centuries. And while this search led to the successful introduction of calcitonin gene-related peptide (CGRP)-targeted therapies a few years ago, it is nowhere near finished: efforts to pinpoint the source of migraine continue, as does the search for other possible therapies.

 

The nearly 39 million people with migraine in the United States would be grateful; they often experience a perplexing, frustrating, and unsatisfying search for a pain-free existence. Migraine is estimated to cost the United States more than $20 billion per year in direct medical expenses and lost productivity. People with migraine, meanwhile, face the prospect of significant disability: more than 8 in 10 participants in the American Migraine Study had at least some headache-related disability, and more than half said their pain has caused severe impairment.

 

The search to find relief for these patients is focused on understanding the pathophysiology of migraine. Approaches include in vitro application of mediators, direct electrical stimulation of trigeminal neurons in vivo, administration of vasoactive substances in vivo, and introduction of exogenous pain-inducing substances in vivo. In 2021, investigators at AstraZeneca and the University of Arizona College of Medicine described their development of an injury-free murine model to be used to study migraine-like pain. Animal research has led to a few interventional studies involving new and existing medications. 

 

How the field has evolved from using a chisel to make a cranial hole to using magnetic resonance imaging and other technologies to examine the trigeminovascular system’s role in the pathophysiology of migraine headache is a tale worth telling.

 

From crocodiles to nitroglycerin to allergies

The search for an effective remedy for migraine has proved to be torturously slow. In addition to trepanation, another procedure thought to have been used during prehistoric times involved a religious ritual whereby a clay crocodile was attached tightly with a strip of linen to an individual’s head. Though the gods were credited if the headache pain receded, relief likely came from the resulting compression on the scalp. Centuries later, in the Middle Ages, treatments included soaking bandages in drugs and then applying them to the head or mixing elixirs with vinegar (which opened scalp pores) and opium (which traveled into the scalp through the open pores).

 

The Persian scholar Ibn Sina (980-1037), also known as Avicenna, postulated that the pain could emanate from the bones of the skull, from within the parenchyma, or from veins and arteries outside the cranium. The medicinal plants he investigated for the treatment of migraine have components that resonate today: antineuroinflammatory agents, analgesics, and even cyclooxygenase-2 inhibitors.

 

Six hundred years later, English physician Thomas Willis described how the vascular system might drive migraine, and in the next century Erasmus Darwin, Charles’s grandfather, proposed that individuals with migraine be spun around so that blood from the head would be forced down toward the feet. In the 1800s, English physician Edward Liveing abandoned the vascular theory, proposing instead that migraine resulted from discharges of the central nervous system.

 

British neurologist William Gowers thought migraine could be a derangement of neurons but ultimately wrote in his Manual of Diseases (p. 852) that, “When all has been said that can be, mystery still envelops the mechanism of migraine.” Gowers advocated continuous treatment with drugs to minimize the frequency of attacks, as well as treating the attacks themselves. His preferred treatments were nitroglycerin in alcohol, combined with other agents, as well as marijuana. His choice of nitroglycerin is an interesting one, given that modern medicine recognizes nitroglycerin, a nitric oxide donor, as a reliable trigger of migraine attacks.

 

The concept of neuronal involvement retained support into the 20th century, solidified by German physician Paul Ehrlich’s Nobel Prize–winning work involving immunology and brain receptors. In the 1920s, thoughts turned to allergy as the source of migraine, as an association between migraine, asthma, and urticaria emerged, but this connection was eventually proved to be incidental, not causal.

 

In the 1930s, the vascular theory was again in vogue, aided by studies performed by US physician Harold G. Wolff. His work, the first to assess headache in a laboratory setting, along with observations about changes in vasculature and evolving treatment, appeared to support the vascular nature of headache. In the 1940s and 1950s, psychosomatic disorders crept into the mix of possible causes, and some categorized migraine as a so-called stress disease.

 

Puzzles and irony

In 1979, Moskowitz and colleagues introduced a new hypothesis focused on the importance of the neuropeptide-containing trigeminal nerve. CGRP is stored in vesicles in sensory nerve terminals, where it is released along with the vasodilating peptide, substance P, when the trigeminal nerve is activated.

 

At about the same time, researchers in England were working on a discovery with ancestral roots going back hundreds of years. In the 18th century, scientists learned that rye ergot was a constrictor of blood vessels. In time, ergot became ergotamine and hence more valuable because it could reduce vascular headaches. But the adverse effects, prominent in those with cardiovascular disease, kept researchers in the lab.

 

So, while Moskowitz and colleagues were focused on CGRP, Humphrey and colleagues were focused on a receptor they found in cranial blood vessels, a serotonin receptor that came to be called 5-HT1B. An agonist soon followed: in 1991, sumatriptan became available in Europe, and 2 years later it was available in the United States. But sumatriptan is an acute treatment, not a preventive therapy. It was Moskowitz’s work that led to studies demonstrating that antisera could neutralize CGRP and substance P.

 

For those with chronic migraine, preventive therapy was exactly what they needed because, while the triptans helped, they were insufficient for many. In a 2-year longitudinal analysis conducted in Italy involving 82,446 individuals prescribed at least 1 triptan, 31,515 had an unmet medical need in migraine (3.1 per 1000 patients).

 

In February 2022, a team of researchers published the results of a genome-wide association study involving more than 100,000 cases. It identified 125 risk loci linked to migraine within the vascular and central nervous systems, firmly grounding the pathophysiology of migraine in neurovascular mechanisms.

 

The fact that it has taken technology to prove that migraine exists and that it is organically rooted is obviously satisfying but also frustrating. For centuries, people with migraine were considered to have caused their own illness or were exaggerating the pain.

 

In March 2022, a large German population-based study found that people with migraine still struggled with bias, stigma, and undermedication. Fifty-four percent said they were not seeing a physician for their migraine, and 33% said they had not received information on medication overuse risks.

 

With captured images of what happens inside the brains of these patients during an attack, now the focus can be on helping them and not questioning the validity of their reported symptoms.

 

Coming next month, a discussion about migraine therapies.

Publications
Topics
Sections

Dr. Messoud Ashina is a Professor of Neurology, Faculty of Health and Medical Sciences, University of Copenhagen, Denmark. He is Director of the Human Migraine Research Unit at the Danish Headache Center and Department of Neurology, Rigshospitalet Glostrup. He serves an associate editor for Cephalalgia, Journal of Headache and Pain, and Brain.

 

Dr. Faisal Mohammad Amin is an Associate Professor, Danish Headache Center, Department of Neurology, Rigshospitalet Glostrup, University of Copenhagen, Denmark. He is an associate editor for Headache Medicine and is President of the Danish Headache Society.

 

Dr. Ashina reports that he has received fees and grants from and/or has served as a principal trial investigator for AbbVie, Amgen, Eli Lilly, Lundbeck Pharmaceuticals, Lundbeck Foundation, Novartis, Novo Nordisk Foundation, and Teva.
 
Dr. Amin reports that he has worked as a consultant, speaker, and/or primary investigator Eli Lilly, Lundbeck, Novartis, and Teva. Both authors have reported that they have no ownership interest nor own any stocks in a pharmaceutical company.

 

 

 

 

 

Since the time of the Neanderthals, humankind has looked for ways to rid the brain of migraine headache. There is evidence that trepanation–removing a portion of bone from the skull–was performed on Neolithic skulls. Did it work for that poor individual? We will never know.

 

What is known is that the often circuitous hunt for effective treatments has taken centuries. And while this search led to the successful introduction of calcitonin gene-related peptides (CGRPs) a few years ago, the search is nowhere near finished, as efforts to pinpoint the source of migraine continue, as does the search for other possible therapies.

 

The nearly 39 million people with migraine in the United States  would be grateful; they often experience a perplexing, frustrating, and unsatisfactory search for a pain-free existence. Migraine is estimated to cost more than $20 million per year in direct medical expenses and lost productivity in the United States. People with migraine, meanwhile, face the prospect of significant disability. More than 8 in every 10 participants in the American Migraine Study had at least some headache-related disability. More than half said their pain has caused severe impairment.

 

The search to find relief for these patients is focused on understanding the pathophysiology of migraine. Approaches include in vitro application of mediators, direct electrical stimulation of trigeminal neurons in vivo, administration of vasoactive substances in vivo, and introduction of exogenous pain-inducing substances in vivo. In 2021, investigators at AstraZeneca and the University of Arizona College of Medicine described their development of an injury-free murine model to be used to study migraine-like pain. Animal research has led to a few interventional studies involving new and existing medications. 

 

How the field has evolved from using a chisel to make a cranial hole to using magnetic resonance imaging and other technologies to examine the trigeminovascular system’s role in the pathophysiology of migraine headache is a tale worth telling.

 

From crocodiles to nitroglycerin to allergies

The search for an effective remedy for migraine has proved to be torturously slow. In addition to trepanation, another procedure thought to have been used during prehistoric times involved a religious ritual whereby a clay crocodile was attached tightly with a strip of linen to an individual’s head. Though the gods were credited if the headache pain receded, relief likely came from the resulting compression on the scalp. Centuries later, in the Middle Ages, treatments included soaking bandages in drugs and then applying them to the head or mixing elixirs with vinegar (which opened scalp pores) and opium (which traveled into the scalp through the open pores).

 

The Persian scholar Ibn Sina (980-1032), also known as Avicenna, postulated that the pain could emanate from the bones that comprise the skull or within the parenchyma or from veins and arteries outside the cranium. The medicinal plants he investigated for the treatment of migraine have components that resonate today: antineuroinflammatory agents, analgesics, and even cyclooxegenase-2 inhibitors.

 

Six hundred years later, English physician Thomas Willis discussed how the vascular system perpetrated migraine, and in the next century, Erasmus Darwin, Charles’ grandfather, proposed that individuals with migraine be spun around so that blood from the head would be forced down toward the feet. In the 1800s, English physician Edward Liveing abandoned vascular theory, instead proposing that migraine resulted from discharges of the central nervous system.

 

British neurologist William Gowers thought migraine could be a derangement of neurons, but ultimately wrote in his Manual of Diseases (p. 852) that, “When all has been that can be, mystery still envelops the mechanism of migraine.” Gowers advocated continuous treatment with drugs to minimize the frequency of attacks, as well as treating the attacks themselves. His preferred treatments were nitroglycerin in alcohol, combined with other agents, as well as marijuana. His choice of nitroglycerin is an interesting one, given that modern medicine considers nitroglycerin an important neurochemical in migraine initiation.

 

The concept of neuronal involvement retained support into the 20th century, solidified by German physician Paul Ehrlich’s Nobel Prize–winning work involving immunology and brain receptors. In the 1920s, thoughts turned to allergy as the source of migraine, as an association between migraine, asthma, and urticaria emerged, but this connection was eventually proved to be incidental, not causal.

 

In the 1930s, the vascular theory again was vogue -- aided by studies performed by US physician Harold G. Wolff. His work, the first to assess headache in a laboratory setting, along with observations about changes in vasculature and evolving treatment, appeared to support the vascular nature of headache. In the 1940s and 1950s, psychosomatic disorders crept into the mix of possible causes. Some categorized migraine as a so-called stress disease.

 

Puzzles and irony

In 1979, Moskowitz and colleagues introduced a new hypothesis focused on the importance of the neuropeptide-containing trigeminal nerve. CGRP is stored in vesicles in sensory nerve terminals, where it is released along with the vasodilating peptide, substance P, when the trigeminal nerve is activated.

 

At about the same time, researchers in England were working on a discovery with ancestral roots going back hundreds of years. In the 18th century, scientists learned that rye ergot was a constrictor of blood vessels. In time, ergot became ergotamine and hence more valuable because it could reduce vascular headaches. But the adverse effects, prominent in those with cardiovascular disease, kept researchers in the lab.

 

So, while Moskowitz and colleagues were focused on CGRPs, Humphrey et al were focused on a receptor they found in cranial blood vessels that came to be called serotonin (5-HT1B). An agonist soon followed. In 1991, sumatriptan became available in Europe, and 2 years later, it was available in the United States. But sumatriptan is for acute care treatment, not a preventive therapy. It was Moskowitz’s work that led to studies demonstrating that antisera could neutralize CGRP and substance P.

 

For those with chronic migraine, preventive therapy was exactly what they needed because, while the triptans helped, they were insufficient for many. In a 2-year longitudinal analysis conducted in Italy involving 82,446 individuals prescribed at least 1 triptan, 31,515 had an unmet medical need in migraine (3.1 per 1000 patients).

 

In February 2022, a team of researchers published the results of a genome-wide association study involving over 100,000 cases. The results were 125 risk loci linked to migraine within the vascular and central nervous systems, thereby firmly establishing that the pathophysiology of migraine exists in neurovascular mechanisms. 

 

The fact that it has taken technology to prove that migraine exists and that it is organically rooted is obviously satisfying but also frustrating. For centuries, people with migraine were considered to have caused their own illness or were exaggerating the pain.

 

In March 2022, a large German population-based study found that people with migraine still struggled with bias, stigma, and undermedication. Fifty-four percent said they were not seeing a physician for their migraine, and 33% said they had not received information on medication overuse risks.

 

With captured images of what happens inside the brains of these patients during an attack, now the focus can be on helping them and not questioning the validity of their reported symptoms.

 

Coming next month, a discussion about migraine therapies.

Dr. Messoud Ashina is a Professor of Neurology, Faculty of Health and Medical Sciences, University of Copenhagen, Denmark. He is Director of the Human Migraine Research Unit at the Danish Headache Center and Department of Neurology, Rigshospitalet Glostrup. He serves an associate editor for Cephalalgia, Journal of Headache and Pain, and Brain.

 

Dr. Faisal Mohammad Amin is an Associate Professor, Danish Headache Center, Department of Neurology, Rigshospitalet Glostrup, University of Copenhagen, Denmark. He is an associate editor for Headache Medicine and is President of the Danish Headache Society.

 

Dr. Ashina reports that he has received fees and grants from and/or has served as a principal trial investigator for AbbVie, Amgen, Eli Lilly, Lundbeck Pharmaceuticals, Lundbeck Foundation, Novartis, Novo Nordisk Foundation, and Teva.
 
Dr. Amin reports that he has worked as a consultant, speaker, and/or primary investigator Eli Lilly, Lundbeck, Novartis, and Teva. Both authors have reported that they have no ownership interest nor own any stocks in a pharmaceutical company.

 

 

 

 

 

Since the time of the Neanderthals, humankind has looked for ways to rid the brain of migraine headache. There is evidence that trepanation–removing a portion of bone from the skull–was performed on Neolithic skulls. Did it work for that poor individual? We will never know.

 

What is known is that the often circuitous hunt for effective treatments has taken centuries. And while this search led to the successful introduction of calcitonin gene-related peptides (CGRPs) a few years ago, the search is nowhere near finished, as efforts to pinpoint the source of migraine continue, as does the search for other possible therapies.

 

The nearly 39 million people with migraine in the United States  would be grateful; they often experience a perplexing, frustrating, and unsatisfactory search for a pain-free existence. Migraine is estimated to cost more than $20 million per year in direct medical expenses and lost productivity in the United States. People with migraine, meanwhile, face the prospect of significant disability. More than 8 in every 10 participants in the American Migraine Study had at least some headache-related disability. More than half said their pain has caused severe impairment.

 

The search to find relief for these patients is focused on understanding the pathophysiology of migraine. Approaches include in vitro application of mediators, direct electrical stimulation of trigeminal neurons in vivo, administration of vasoactive substances in vivo, and introduction of exogenous pain-inducing substances in vivo. In 2021, investigators at AstraZeneca and the University of Arizona College of Medicine described their development of an injury-free murine model to be used to study migraine-like pain. Animal research has led to a few interventional studies involving new and existing medications. 

 

How the field has evolved from using a chisel to make a cranial hole to using magnetic resonance imaging and other technologies to examine the trigeminovascular system’s role in the pathophysiology of migraine headache is a tale worth telling.

 

From crocodiles to nitroglycerin to allergies

The search for an effective remedy for migraine has proved to be torturously slow. In addition to trepanation, another procedure thought to have been used during prehistoric times involved a religious ritual whereby a clay crocodile was attached tightly with a strip of linen to an individual’s head. Though the gods were credited if the headache pain receded, relief likely came from the resulting compression on the scalp. Centuries later, in the Middle Ages, treatments included soaking bandages in drugs and then applying them to the head or mixing elixirs with vinegar (which opened scalp pores) and opium (which traveled into the scalp through the open pores).

 

The Persian scholar Ibn Sina (980-1032), also known as Avicenna, postulated that the pain could emanate from the bones that comprise the skull or within the parenchyma or from veins and arteries outside the cranium. The medicinal plants he investigated for the treatment of migraine have components that resonate today: antineuroinflammatory agents, analgesics, and even cyclooxegenase-2 inhibitors.

 

Six hundred years later, English physician Thomas Willis discussed how the vascular system perpetrated migraine, and in the next century, Erasmus Darwin, Charles’ grandfather, proposed that individuals with migraine be spun around so that blood from the head would be forced down toward the feet. In the 1800s, English physician Edward Liveing abandoned vascular theory, instead proposing that migraine resulted from discharges of the central nervous system.

 

British neurologist William Gowers thought migraine could be a derangement of neurons, but ultimately wrote in his Manual of Diseases (p. 852) that, “When all has been that can be, mystery still envelops the mechanism of migraine.” Gowers advocated continuous treatment with drugs to minimize the frequency of attacks, as well as treating the attacks themselves. His preferred treatments were nitroglycerin in alcohol, combined with other agents, as well as marijuana. His choice of nitroglycerin is an interesting one, given that modern medicine considers nitroglycerin an important neurochemical in migraine initiation.

 

The concept of neuronal involvement retained support into the 20th century, solidified by German physician Paul Ehrlich’s Nobel Prize–winning work involving immunology and brain receptors. In the 1920s, thoughts turned to allergy as the source of migraine, as an association between migraine, asthma, and urticaria emerged, but this connection was eventually proved to be incidental, not causal.

 

In the 1930s, the vascular theory again was vogue -- aided by studies performed by US physician Harold G. Wolff. His work, the first to assess headache in a laboratory setting, along with observations about changes in vasculature and evolving treatment, appeared to support the vascular nature of headache. In the 1940s and 1950s, psychosomatic disorders crept into the mix of possible causes. Some categorized migraine as a so-called stress disease.

 

Puzzles and irony

In 1979, Moskowitz and colleagues introduced a new hypothesis focused on the importance of the neuropeptide-containing trigeminal nerve. CGRP is stored in vesicles in sensory nerve terminals, where it is released along with the vasodilating peptide, substance P, when the trigeminal nerve is activated.

 

At about the same time, researchers in England were working on a discovery with ancestral roots going back hundreds of years. In the 18th century, scientists learned that rye ergot was a constrictor of blood vessels. In time, ergot became ergotamine and hence more valuable because it could reduce vascular headaches. But the adverse effects, prominent in those with cardiovascular disease, kept researchers in the lab.

 

So, while Moskowitz and colleagues were focused on CGRPs, Humphrey et al were focused on a receptor they found in cranial blood vessels that came to be called serotonin (5-HT1B). An agonist soon followed. In 1991, sumatriptan became available in Europe, and 2 years later, it was available in the United States. But sumatriptan is for acute care treatment, not a preventive therapy. It was Moskowitz’s work that led to studies demonstrating that antisera could neutralize CGRP and substance P.

 

For those with chronic migraine, preventive therapy was exactly what they needed because, while the triptans helped, they were insufficient for many. In a 2-year longitudinal analysis conducted in Italy involving 82,446 individuals prescribed at least 1 triptan, 31,515 had an unmet medical need in migraine (3.1 per 1000 patients).

 

In February 2022, a team of researchers published the results of a genome-wide association study involving over 100,000 cases. The analysis identified 125 risk loci linked to migraine within the vascular and central nervous systems, firmly establishing that the pathophysiology of migraine rests in neurovascular mechanisms.

 

That it has taken modern technology to prove that migraine is real and organically rooted is satisfying but also frustrating. For centuries, people with migraine were considered to have caused their own illness or to be exaggerating the pain.

 

In March 2022, a large German population-based study found that people with migraine still struggled with bias, stigma, and undermedication. Fifty-four percent said they were not seeing a physician for their migraine, and 33% said they had not received information on medication overuse risks.

 

Now that images capture what happens inside these patients’ brains during an attack, the focus can be on helping them rather than questioning the validity of their reported symptoms.

 

Coming next month, a discussion about migraine therapies.

Display Headline
The Slow, Long Search for Migraine’s Headwaters

Clinical Edge Journal Scan Commentary: Migraine June 2022

Article Type
Changed
Sat, 07/30/2022 - 11:27
Dr Berk scans the journal, so you don't have to!

 

Many of our patients with refractory migraine do not respond to first-line acute or preventive treatments, and, almost by definition, first- and second-line treatments have failed in the majority of patients on calcitonin gene-related peptide (CGRP) antagonist medications. Three studies this month highlight the efficacy of CGRP monoclonal antibody (mAb) and small-molecule medications in this population specifically.

 

After an initial first dose of a CGRP mAb treatment, many patients ask whether a suboptimal response necessitates switching to another agent or whether a second (or third) dose should be given first. Eptinezumab is an intravenously administered mAb that is repeated every 12 weeks. Schim and colleagues present post hoc data for patients who initially had a minimally beneficial response to eptinezumab and received a second dose at week 13.
 

The authors define a suboptimal response as less than a 50% decrease in monthly migraine days after 12 weeks. There were two pooled samples of patients: those who received 100 mg of eptinezumab and those who received a 300 mg dose. Approximately 45% of patients in the pivotal trials of eptinezumab (PROMISE-1 and -2) were considered suboptimal responders, and 33%-37% of those suboptimal responders had a more than 50% decrease in their monthly migraine days after a second dose (week 24).
 

Further analysis identified predictive factors that favored a second-dose response. The most prominent (and arguably most obvious) was a favorable response after the first dose; the greater the percent change in monthly migraine days after the first dose, the greater the response after the second dose. Change in the Headache Impact Test (HIT-6) disability score after the first dose was also a strong predictor of improvement after the second dose.

 

When we discuss continuation of medications with our patients, especially when they have a suboptimal response, we should first keep in mind the degree of improvement that the patient initially had. There can be benefit from further treatment with the same medication; however, if the response truly was minimal, it may be better to consider another treatment option.

 

Practically every patient taking a preventive medication is taking at least one acute medication as well. Even the best preventive medication is no guarantee that further exacerbations will not occur, and our patients will still need an acute treatment option even when their preventive medications are very effective. The study by Ambrosini and colleagues shows how effective a preventive medication can be, specifically in allowing patients to use fewer acute medications over time in a population that has been resistant to two to four treatments.
 

Galcanezumab is a once-monthly mAb for the prevention of migraine. The authors of this study compared acute medication use for migraine in both the randomized and open-label stages of a study assessing treatment-refractory patients. A total of 462 patients were enrolled, all of whom were resistant to two to four standard-of-care migraine-preventive medications that had been stopped because of lack of efficacy or poor tolerability. The double-blind stage lasted 3 months; the open-label stage lasted another 3 months.
 

The treatment group used significantly fewer acute medications after just the first month and continued to improve through month 3. In the open-label phase, a similar improvement was noted in patients transitioning from placebo. Emergency department use for migraine treatment also decreased significantly, by more than two thirds in month 3.

Migraine prevention will always remain the key ingredient of improvement for patients with higher frequencies of migraine, and adequate prevention allows for lower use of acute medications and less healthcare system use in general.
 

Most practitioners recommend migraine-specific medications for the acute treatment of migraine, and since the advent of sumatriptan this has usually meant a triptan. However, a significant percentage of the population (up to 44% in one study) are intolerant of triptans, have contraindications to them, or respond insufficiently to them. This can be due to strong triptan side effects (worsened nausea; tightness or soreness of the chest, shoulder, and neck muscles), cardiovascular risk factors, or inadequate response 2 hours after treatment. The study by Lipton and colleagues specifically assessed the efficacy of ubrogepant in this population.
 

Ubrogepant is a small-molecule CGRP antagonist for the acute treatment of migraine. Although the practice remains somewhat controversial, most practitioners use ubrogepant in patients with some cardiovascular risk, a situation in which they would be more likely to avoid triptans. The study authors pooled post hoc data from the pivotal ubrogepant trials (ACHIEVE-1 and -2) to isolate patients with an insufficient response to triptans; the primary outcome was improvement in function 2 hours after the medication dose.
 

Participants in the pivotal trials were separated into three groups: triptan responders, triptan insufficient responders, and triptan-naive patients. Triptan response was defined as achieving pain freedom 2 hours after the medication dose. The insufficient-response group included both those who had an insufficient response and those who no longer used a triptan owing to intolerance or contraindications. The primary outcome, functional improvement, was measured on a 4-point response scale (0 = no disability, 1 = mildly impaired, 2 = moderately impaired, 3 = severely impaired). In addition, patients reported satisfaction with the medication (yes or no) at 2 and 24 hours and their impression of overall change at 2 hours on a 7-point scale.
 

The triptan insufficient responder group (451 patients) had significant improvement in the primary outcome of functional disability at 2, 4, and 7 hours after receipt of medication, but there was no statistical difference at 1 hour. Results were similar among those with intolerance of triptans, insufficient response to triptans, or contraindications to triptans. The secondary outcomes of satisfaction and global impression of change were also statistically improved in the insufficient-responders group. No additional tolerability issues or adverse events were noted in this group.

It would certainly be worth considering the use of a gepant acute medication, such as ubrogepant, in patients who are intolerant to or inadequately treated by triptan medications. There still is much to learn about cardiovascular risk and the use of CGRP antagonists, and although no adverse events were noted, more data may be necessary to widely prescribe this class in higher-risk patients.

Author and Disclosure Information

Thomas Berk, MD 

Neura Health and Thomas Jefferson University, Woodbury, NJ 


White children more likely to get imaging in EDs: Study

Article Type
Changed
Mon, 06/06/2022 - 10:23

 

Non-Hispanic White children were more likely to receive diagnostic imaging at children’s hospitals’ emergency departments across the United States than were Hispanic children and non-Hispanic Black children, according to a large study published in JAMA Network Open.

Researchers found that as the percentage of children from minority groups cared for by a hospital increased, the imaging gap between those children and non-Hispanic White children widened.

The cross-sectional study, led by Margaret E. Samuels-Kalow, MD, MPhil, MSHP, with the department of emergency medicine, Massachusetts General Hospital and Harvard Medical School in Boston, included 38 children’s hospitals and more than 12 million ED visits.

“These findings emphasize the urgent need for interventions at the hospital level to improve equity in imaging in pediatric emergency medicine,” the authors write.

Patients included in the study were younger than 18 and visited an ED from January 2016 through December 2019. Data were pulled from the Pediatric Health Information System.

Of the more than 12 million visits in this study, 3.5 million (28.7%) involved at least one diagnostic imaging test.

Diagnostic imaging was performed in 1.5 million visits (34.2%) for non-Hispanic White children; 790,961 (24.6%) for non-Hispanic Black children; and 907,222 (26.1%) for Hispanic children (P < .001).

Non-Hispanic Black children were consistently less likely to get diagnostic imaging than non-Hispanic White counterparts at every hospital in the study, no matter the imaging modality: radiography, ultrasonography, computed tomography, or magnetic resonance imaging.

Hispanic patients were generally less likely to get imaging than non-Hispanic White patients, though results were less consistent for ultrasound and MRI.

In a sensitivity analysis, when looking at imaging from patients’ first visit across the study cohort, non-Hispanic Black children were significantly less likely to get imaging than non-Hispanic White children (adjusted odds ratio, 0.77; 95% confidence interval, 0.74-0.79).

“This remained significant even after adjustment for a priori specified confounders including hospital propensity to image,” the authors write.

The authors acknowledge that some of the differences may be attributable to the patient mix in terms of case severity or indications for imaging by hospital, but they note that all models were adjusted for diagnosis-related group and other potential confounders.

This study did not assess whether one group is being overtested. Researchers also note that higher rates of imaging do not necessarily indicate higher quality of care.

However, the authors note, previous research has suggested overtesting of non-Hispanic White patients for head CT and chest pain, as well as patterns of overtreatment of non-Hispanic White patients who have bronchiolitis or viral upper respiratory tract infections.

Medell Briggs-Malonson, MD, MPH, chief of health equity, diversity and inclusion for the University of California, Los Angeles, Hospital and Clinic System, who was not part of the study, said in an interview that “this all rings true.”

“This is not the first study we have had in either the pediatric or adult populations that shows disparate levels of care as well as health outcomes. Now we are starting to be able to measure it,” she said.

This study is further evidence of medical racism, she said, and it highlights that it is not hospital choice or insurance type driving the numbers.

“When you control for those factors, it looks to be it’s only due to race and that’s because of the very deep levels of implicit bias as well as explicit bias that we still have in our health systems and even in our providers,” said Dr. Briggs-Malonson, who is also an associate professor of emergency medicine at UCLA. “It’s incredibly important to identify and immediately address.”

 

 

What can be done?

Changing these patterns starts with knowing the numbers, the authors write.

“Hospitals should measure their own differences in imaging rates and increase awareness of existing areas of differential treatment as a starting point for improvement,” Dr. Samuels-Kalow and coauthors say.

Dr. Briggs-Malonson added that guidelines are very clear about when children should get imaging. Adhering to evidence-based guidelines can help avoid variations in care from external factors.

“If children are not receiving the absolute best comprehensive evaluation in the emergency department that they deserve, we can miss many different illnesses, which can lead to worse outcomes,” she noted.

As for what might motivate a lack of imaging, Dr. Briggs-Malonson pointed to longstanding trends of providers assuming that complaints raised by minority patients may not be as severe as reported. Conversely, when caring for White patients, providers may feel that more tests and imaging are better, out of greater fear of missing something, she said.

At UCLA, she says, dashboards have been developed to track statistics on care by age, race, ethnicity, language, insurance type, etc., though not specifically in pediatric imaging, to assess and address any care inequities.

Summer L. Kaplan, MD, MS, director of emergency radiology at Children’s Hospital of Philadelphia, who also was not part of the study, said the finding of racial disparities in pediatric ED imaging provides evidence that gaps still exist in providing the best care to all children and families seeking emergency care.

“However, it is important to recognize that more imaging does not equal better care,” she said. “More imaging may be associated with unnecessary, low-value tests that may add radiation and other risks but do not improve care.”

She said higher rates of imaging may occur when patients present early in the course of a disease, when the differential diagnosis remains broad.

If families have delayed seeking care because of time constraints, transportation problems, cost of care, or mistrust of the health system, children may present later in the course of a disease and require less imaging for a diagnosis, she explained.

“This paper offers a valuable look at the inequities that exist in pediatric emergency imaging use, and further research will be essential to understand and address the causes of these differences,” Dr. Kaplan said.

A coauthor reported compensation as a member of a Medical Review Committee for Highmark. Other coauthors reported grants from the U.S. Agency for Healthcare Research and Quality outside the submitted work. Dr. Briggs-Malonson and Dr. Kaplan reported no relevant financial relationships.

Publications
Topics
Sections

 

Non-Hispanic White children were more likely to receive diagnostic imaging at children’s hospitals’ emergency departments across the United States than were Hispanic children and non-Hispanic Black children, according to a large study published in JAMA Network Open.

Researchers found that, the more the percentage of children from minority groups cared for by a hospital increased, the wider the imaging gap between those children and non-Hispanic White children.

The cross-sectional study, led by Margaret E. Samuels-Kalow, MD, MPhil, MSHP, with the department of emergency medicine, Massachusetts General Hospital and Harvard Medical School in Boston, included 38 children’s hospitals and more than 12 million ED visits.

“These findings emphasize the urgent need for interventions at the hospital level to improve equity in imaging in pediatric emergency medicine,” the authors write.

Patients included in the study were younger than 18 and visited an ED from January 2016 through December 2019. Data were pulled from the Pediatric Health Information System.

Of the more than 12 million visits in this study, 3.5 million (28.7%) involved at least one diagnostic imaging test.

Diagnostic imaging was performed in 1.5 million visits (34.2%) for non-Hispanic White children; 790,961 (24.6%) for non-Hispanic Black children; and 907,222 (26.1%) for Hispanic children (P < .001).

Non-Hispanic Black children were consistently less likely to get diagnostic imaging than their non-Hispanic White counterparts at every hospital in the study, regardless of imaging modality: radiography, ultrasonography, computed tomography (CT), or magnetic resonance imaging (MRI).

Hispanic patients were generally less likely to get imaging than non-Hispanic White patients, though results were less consistent for ultrasound and MRI.

In a sensitivity analysis, when looking at imaging from patients’ first visit across the study cohort, non-Hispanic Black children were significantly less likely to get imaging than non-Hispanic White children (adjusted odds ratio, 0.77; 95% confidence interval, 0.74-0.79).

“This remained significant even after adjustment for a priori specified confounders including hospital propensity to image,” the authors write.

The authors acknowledge that some of the differences may be attributable to hospital-level variation in the patient mix, including case severity and indications for imaging, but they note that all models were adjusted for diagnosis-related group and other potential confounders.

This study did not assess whether one group is being overtested. Researchers also note that higher rates of imaging do not necessarily indicate higher quality of care.

However, the authors note, previous research has suggested overtesting of non-Hispanic White patients for head CT and chest pain, as well as patterns of overtreatment of non-Hispanic White patients who have bronchiolitis or viral upper respiratory tract infections.

Medell Briggs-Malonson, MD, MPH, chief of health equity, diversity and inclusion for the University of California, Los Angeles, Hospital and Clinic System, who was not part of the study, said in an interview that “this all rings true.”

“This is not the first study we have had in either the pediatric or adult populations that shows disparate levels of care as well as health outcomes. Now we are starting to be able to measure it,” she said.

This study is further evidence of medical racism, she said, and highlights that neither hospital choice nor insurance type explains the numbers.

“When you control for those factors, it looks to be it’s only due to race and that’s because of the very deep levels of implicit bias as well as explicit bias that we still have in our health systems and even in our providers,” said Dr. Briggs-Malonson, who is also an associate professor of emergency medicine at UCLA. “It’s incredibly important to identify and immediately address.”

 

 

What can be done?

Changing these patterns starts with knowing the numbers, the authors write.

“Hospitals should measure their own differences in imaging rates and increase awareness of existing areas of differential treatment as a starting point for improvement,” Dr. Samuels-Kalow and coauthors say.

Dr. Briggs-Malonson added that guidelines are very clear about when children should get imaging, and that adhering to evidence-based guidelines can help avoid variations in care driven by external factors.

“If children are not receiving the absolute best comprehensive evaluation in the emergency department that they deserve, we can miss many different illnesses, which can lead to worse outcomes,” she noted.

As for what might motivate the lack of imaging, Dr. Briggs-Malonson pointed to a longstanding tendency among providers to regard complaints raised by minority patients as less severe than reported. Conversely, in caring for White patients there may be a feeling that more tests and imaging are better, driven by greater fear of missing something, she said.

At UCLA, she said, dashboards have been developed to track statistics on care by age, race, ethnicity, language, insurance type, and other factors, though not specifically in pediatric imaging, to assess and address any care inequities.

Summer L. Kaplan, MD, MS, director of emergency radiology at Children’s Hospital of Philadelphia, who also was not part of the study, said the finding of racial disparities in pediatric ED imaging provides evidence that gaps still exist in providing the best care to all children and families seeking emergency care.

“However, it is important to recognize that more imaging does not equal better care,” she said. “More imaging may be associated with unnecessary, low-value tests that may add radiation and other risks but do not improve care.”

She said higher rates of imaging may occur when patients present early in the course of a disease, when the differential diagnosis remains broad.

If families have delayed seeking care because of time constraints, transportation problems, cost of care, or mistrust of the health system, children may present later in the course of a disease and require less imaging for a diagnosis, she explained.

“This paper offers a valuable look at the inequities that exist in pediatric emergency imaging use, and further research will be essential to understand and address the causes of these differences,” Dr. Kaplan said.

A coauthor reported compensation as a member of a Medical Review Committee for Highmark. Other coauthors reported grants from the U.S. Agency for Healthcare Research and Quality outside the submitted work. Dr. Briggs-Malonson and Dr. Kaplan reported no relevant financial relationships.

 

Article Source

FROM JAMA NETWORK OPEN


Are teenagers tone deaf?

Article Type
Changed
Thu, 06/02/2022 - 16:21

I suspect that you have heard or read about the recent study in the Journal of Neuroscience that claims to have discovered evidence that as children become teenagers, their brains begin to tune out their mother’s voices. The story appeared in at least 10 Internet news sources including the American Academy of Pediatrics’ daily briefing.

Based on functional MRI studies by a group at Stanford (Calif.) University, the researchers found that, while teenagers in general became more attentive to all voices as they reached puberty, novel voices were favored over the maternal voices that had flooded their environment when they were younger children. Of course, none of this comes as a surprise to anyone who has parented a teenager or spent any time trying to communicate with adolescents. Although we all must be a bit careful not to put too much stock in functional MRI studies, these findings do suggest a physiologic basis for the peer pressure that becomes one of the hallmarks of adolescence. I wouldn’t be surprised if some clever entrepreneur has already begun using MRI to search for just the right tonal qualities that will make the perfect Internet influencer.

Dr. William G. Wilkoff practiced primary care pediatrics in Brunswick, Maine, for nearly 40 years.
Dr. William G. Wilkoff

But will these MRI studies help parents who have already thrown up their hands and admitted defeat, mumbling, “He’s stopped listening to me”? The more observant parents realized long ago that their words were often the least effective tools in their tool kit when it comes to modifying behavior.

Just listen in any neighborhood playground or grocery store to how often you hear a parent trying to get a toddler or young child to correct a misbehavior using threats or promises that you and everyone else within earshot know will never be followed by any consequence. How often do you see a parent modeling behaviors that they expect their children to avoid?

Some more “enlightened” parents will avoid threats and instead attempt to engage in a dialogue with their misbehaving child, hoping that a rational discussion with a sleep-deprived toddler in full tantrum mode can convince the youngster to self-correct.

I’m sure you learned, and may have even used, the playground retort “sticks and stones may break my bones but words will never hurt me.” Of course, more untrue words were never spoken. Words can hurt, and they can scar. But words and threats can also be hollow and will fall on ears deafened by months and years during which there were no consequences. It is certainly nice to know that there is some physiologic correlate of what we all suspected. The good news is that teenagers are still listening to us, although they are increasingly interested in what their peers and the rest of the world have to say.

What the study fails to point out is that, while teenagers may still be listening to us, their behavior is molded not so much by what we say as by how we as parents and adults behave. Have we parented in a way in which our words are followed up with appropriate consequences? And, more importantly, have we modeled behavior that matches our words? We need to help parents realize that words can be important, but parenting by example is the gold standard.

Dr. Wilkoff practiced primary care pediatrics in Brunswick, Maine, for nearly 40 years. He has authored several books on behavioral pediatrics, including “How to Say No to Your Toddler.” Other than a Littmann stethoscope he accepted as a first-year medical student in 1966, Dr. Wilkoff reports having nothing to disclose. Email him at [email protected].


TDF use in HBV-HIV coinfection linked with kidney, bone issues

Article Type
Changed
Thu, 06/02/2022 - 16:17

Patients coinfected with hepatitis B virus (HBV) and human immunodeficiency virus (HIV) who take tenofovir disoproxil fumarate (TDF) may have worsening renal function and bone turnover, according to a small, prospective cohort study in HIV Medicine.

“In this HBV-HIV cohort of adults with high prevalence of tenofovir use, several biomarkers of renal function and bone turnover indicated worsening status over approximately 4 years, highlighting the importance of clinicians’ awareness,” lead author Richard K. Sterling, MD, MSc, assistant chair of research in the department of internal medicine of Virginia Commonwealth University, Richmond, told this news organization in an email.

TDF is a common component of antiretroviral therapy (ART) in adults coinfected with HBV and HIV. The drug is known to adversely affect kidney function and bone turnover, but few studies have evaluated these issues, the authors write.

For the study, Dr. Sterling and colleagues enrolled adults coinfected with HBV and HIV who were taking any type of ART at eight sites in North America.

The authors assessed demographics, medical history, current health status reports, physical exams, and blood and urine tests. They extracted clinical, laboratory, and radiologic data from medical records, and they processed whole blood, stored serum at -70 °C (-94 °F) at each site, and tested specimens in central laboratories.

The researchers assessed the participants at baseline and every 24 weeks for up to 192 weeks (3.7 years). They tested bone markers from stored serum at baseline, week 96, and week 192. And they recorded changes in renal function markers and bone turnover over time.

At baseline, the median age of the 115 patients was 49 years; 91% were male, and 52% were non-Hispanic Black. Their median body mass index was 26 kg/m2, with 6.3% of participants underweight and 59% overweight or obese. The participants had been living with HIV for a median of about 20 years.

Overall, 84% of participants reported tenofovir use, 3% reported no HBV therapy, and 80% had HBV/HIV suppression. In addition, 13% had stage 2 liver fibrosis and 23% had stage 3 to 4 liver fibrosis. No participants reported using immunosuppressants, 4% reported using an anticoagulant, 3% reported taking calcium plus vitamin D, and 33% reported taking multivitamins.

Throughout the follow-up period, TDF use ranged from 80% to 92%. Estimated glomerular filtration rate (eGFR) dropped from 87.1 to 79.9 mL/min/1.73 m2 over 192 weeks (P < .001), but the prevalence of eGFR < 60 mL/min/1.73 m2 did not appear to change over time (always < 16%; P = .43).

From baseline to week 192, procollagen type 1 N-terminal propeptide (P1NP) dropped from 146.7 to 130.5 ng/mL (P = .001), osteocalcin dropped from 14.4 to 10.2 ng/mL (P < .001), and C-terminal telopeptide of type I collagen (CTX-1) dropped from 373 to 273 pg/mL (P < .001).

Predictors of decrease in eGFR included younger age, male sex, and overweight or obesity. Predictors of worsening bone turnover included Black race, healthy weight, advanced fibrosis, undetectable HBV DNA, and lower parathyroid hormone level.
 

Monitor patients with HBV and HIV closely

“The long-term effects of TDF on renal and bone health are important to monitor,” Dr. Sterling advised. “For renal health, physicians should monitor GFR as well as creatinine. For bone health, monitoring serum calcium, vitamin D, parathyroid hormone, and phosphate may not catch increased bone turnover.”

“We knew that TDF can cause renal dysfunction; however, we were surprised that we did not observe significant rise in serum creatinine but did observe decline in glomerular filtration rate and several markers of increased bone turnover,” he added.

Dr. Sterling acknowledged that limitations of the study include its small cohort, short follow-up, and lack of control participants who were taking TDF while mono-infected with either HBV or HIV. He added that strengths include close follow-up, use of bone turnover markers, and control for severity of liver disease.

Joseph Alvarnas, MD, a hematologist and oncologist in the department of hematology & hematopoietic cell transplant at City of Hope Comprehensive Cancer Center, Duarte, California, told this news organization that he welcomes the rigor of the study. “This study provides an important reminder of the complexities of taking a comprehensive management approach to the care of patients with long-term HIV infection,” Dr. Alvarnas wrote in an email. He was not involved in the study.

“More than 6 million people worldwide live with coinfection,” he added. “Patients coinfected with HBV and HIV have additional care needs over those living with only chronic HIV infection. With more HIV-infected patients becoming long-term survivors who are managed through the use of effective ART, fully understanding the differentiated long-term care needs of this population is important.”

Debika Bhattacharya, MD, a specialist in HIV and viral hepatitis coinfection in the Division of Infectious Diseases at UCLA Health, Los Angeles, joined Dr. Sterling and Dr. Alvarnas in advising clinicians to regularly evaluate the kidney and bone health of their coinfected patients.

“While this study focuses on the very common antiretroviral agent TDF, it will be important to see the impact of a similar drug, tenofovir alafenamide (TAF) – which has been associated with less impact on bone and kidney health – on clinical outcomes in HBV-HIV coinfection,” Dr. Bhattacharya, who also was not involved in the study, wrote in an email.

The National Institute of Diabetes and Digestive and Kidney Diseases funded the study. Dr. Sterling has served on boards for Pfizer and AskBio, and he reports research grants from Gilead, Abbott, AbbVie, and Roche to his institution. Most other authors report financial relationships with pharmaceutical companies. Dr. Alvarnas reports no relevant financial relationships. Dr. Bhattacharya has received a research grant from Gilead Sciences, paid to her institution.

A version of this article first appeared on Medscape.com.

Publications
Topics
Sections

Patients coinfected with hepatitis B virus (HBV) and human immunodeficiency virus who take tenofovir disoproxil fumarate (TDF) may have worsening renal function and bone turnover, according to a small, prospective cohort study in HIV Medicine.

“In this HBV-HIV cohort of adults with high prevalence of tenofovir use, several biomarkers of renal function and bone turnover indicated worsening status over approximately 4 years, highlighting the importance of clinicians’ awareness,” lead author Richard K. Sterling, MD, MSc, assistant chair of research in the department of internal medicine of Virginia Commonwealth University, Richmond, told this news organization in an email.

TDF is a common component of antiretroviral therapy (ART) in adults coinfected with HBV and HIV. The drug is known to adversely affect kidney function and bone turnover, but few studies have evaluated these issues, the authors write.

Dr. Sterling and colleagues enrolled adults coinfected with HBV and HIV who were taking any type of ART in their study at eight sites in North America.

The authors assessed demographics, medical history, current health status reports, physical exams, and blood and urine tests. They extracted clinical, laboratory, and radiologic data from medical records, and they processed whole blood, stored serum at -70 °C (-94 °F) at each site, and tested specimens in central laboratories.

The researchers assessed the participants at baseline and every 24 weeks for up to 192 weeks (3.7 years). They tested bone markers from stored serum at baseline, week 96, and week 192. And they recorded changes in renal function markers and bone turnover over time.

At baseline, the median age of the 115 patients was 49 years; 91% were male, and 52% were non-Hispanic Black. Their median body mass index was 26 kg/m2, with 6.3% of participants underweight and 59% overweight or obese. The participants had been living with HIV for a median of about 20 years.

Overall, 84% of participants reported tenofovir use, 3% reported no HBV therapy, and 80% had HBV/HIV suppression. In addition, 13% had stage 2 liver fibrosis and 23% had stage 3 to 4 liver fibrosis. No participants reported using immunosuppressants, 4% reported using an anticoagulant, 3% reported taking calcium plus vitamin D, and 33% reported taking multivitamins.

Throughout the follow-up period, TDF use ranged from 80% to 92%. Estimated glomerular filtration rate (eGFR) dropped from 87.1 to 79.9 mL/min/1.73 m2 over 192 weeks (P < .001), but the proportion of participants with eGFR < 60 mL/min/1.73 m2 did not appear to change over time (always < 16%; P = .43).
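
The article does not say which creatinine-based equation the investigators used to estimate GFR; as an illustration of how an eGFR value like those above is derived, here is a minimal sketch of the race-free CKD-EPI 2021 creatinine equation (the constants kappa, alpha, and the age and sex factors are from the published equation, not from this study):

```python
import math

def ckd_epi_2021(scr_mg_dl: float, age: int, female: bool) -> float:
    """Estimate GFR (mL/min/1.73 m2) using the race-free CKD-EPI 2021
    creatinine equation. Inputs: serum creatinine in mg/dL, age in years."""
    kappa = 0.7 if female else 0.9     # sex-specific creatinine scaling
    alpha = -0.241 if female else -0.302
    ratio = scr_mg_dl / kappa
    egfr = (142.0
            * min(ratio, 1.0) ** alpha     # term for creatinine below kappa
            * max(ratio, 1.0) ** -1.200    # term for creatinine above kappa
            * 0.9938 ** age)               # age decay factor
    return egfr * 1.012 if female else egfr

# A hypothetical participant: a 49-year-old man (the cohort's median age)
# with serum creatinine 1.0 mg/dL
print(round(ckd_epi_2021(1.0, 49, female=False), 1))
```

The equation's shape explains Dr. Sterling's observation below: because eGFR scales nonlinearly with creatinine and declines with age, eGFR can fall meaningfully while serum creatinine itself shows little change.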

From baseline to week 192, procollagen type 1 N-terminal propeptide (P1NP) dropped from 146.7 to 130.5 ng/mL (P = .001), osteocalcin dropped from 14.4 to 10.2 ng/mL (P < .001), and C-terminal telopeptides of type I collagen (CTX-1) dropped from 373 to 273 pg/mL (P < .001).
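
Expressed as relative changes, the reported marker values correspond to declines of roughly 11% to 29% over 192 weeks; a quick calculation from the figures above:

```python
# Percent declines in the reported bone turnover markers, baseline -> week 192
# (values taken directly from the study figures quoted in this article)
markers = {
    "P1NP (ng/mL)": (146.7, 130.5),
    "osteocalcin (ng/mL)": (14.4, 10.2),
    "CTX-1 (pg/mL)": (373.0, 273.0),
}
for name, (baseline, week192) in markers.items():
    pct = 100.0 * (baseline - week192) / baseline
    print(f"{name}: {pct:.1f}% decline")
```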

Predictors of decrease in eGFR included younger age, male sex, and overweight or obesity. Predictors of worsening bone turnover included Black race, healthy weight, advanced fibrosis, undetectable HBV DNA, and lower parathyroid hormone level.

Monitor patients with HBV and HIV closely

“The long-term effects of TDF on renal and bone health are important to monitor,” Dr. Sterling advised. “For renal health, physicians should monitor GFR as well as creatinine. For bone health, monitoring serum calcium, vitamin D, parathyroid hormone, and phosphate may not catch increased bone turnover.”

“We knew that TDF can cause renal dysfunction; however, we were surprised that we did not observe significant rise in serum creatinine but did observe decline in glomerular filtration rate and several markers of increased bone turnover,” he added.

Dr. Sterling acknowledged that limitations of the study include its small cohort, short follow-up, and lack of control participants who were taking TDF while mono-infected with either HBV or HIV. He added that strengths include close follow-up, use of bone turnover markers, and control for severity of liver disease.

Joseph Alvarnas, MD, a hematologist and oncologist in the department of hematology & hematopoietic cell transplant at City of Hope Comprehensive Cancer Center, Duarte, California, told this news organization that he welcomes the rigor of the study. “This study provides an important reminder of the complexities of taking a comprehensive management approach to the care of patients with long-term HIV infection,” Dr. Alvarnas wrote in an email. He was not involved in the study.

“More than 6 million people worldwide live with coinfection,” he added. “Patients coinfected with HBV and HIV have additional care needs over those living with only chronic HIV infection. With more HIV-infected patients becoming long-term survivors who are managed through the use of effective ART, fully understanding the differentiated long-term care needs of this population is important.”

Debika Bhattacharya, MD, a specialist in HIV and viral hepatitis coinfection in the Division of Infectious Diseases at UCLA Health, Los Angeles, joined Dr. Sterling and Dr. Alvarnas in advising clinicians to regularly evaluate the kidney and bone health of their coinfected patients.

“While this study focuses on the very common antiretroviral agent TDF, it will be important to see the impact of a similar drug, tenofovir alafenamide (TAF) – which has been associated with less impact on bone and kidney health – on clinical outcomes in HBV-HIV coinfection,” Dr. Bhattacharya, who also was not involved in the study, wrote in an email.

The National Institute of Diabetes and Digestive and Kidney Diseases funded the study. Dr. Sterling has served on boards for Pfizer and AskBio, and he reports research grants from Gilead, Abbott, AbbVie, and Roche to his institution. Most other authors report financial relationships with pharmaceutical companies. Dr. Alvarnas reports no relevant financial relationships. Dr. Bhattacharya has received a research grant from Gilead Sciences, paid to her institution.

A version of this article first appeared on Medscape.com.

